[Binary tar archive — gzip-compressed contents are not recoverable as text. Archive member listing from the tar headers:]

var/home/core/zuul-output/                    (directory, owner core)
var/home/core/zuul-output/logs/               (directory, owner core)
var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log, owner core)
L'CTֶcWxrn@%^%9UYs Y\J\dzq1ŗ؎<pdjhfd&l_GGG@ri֑rbQ'TXQM(:jiPT~//e'>0|y}8 ^o]_@Lw~xKJ8/EՇ'a5AwHn-r5]o&S@[TJ}\JX)wQkk"cM-2$J0DyT#Ĕ0 9͢?2䣺b1bPpῊ3CfABZOO)VP `0(rHXI'"K5>U545XƩןX.YAs) FUW!8Wf_z;WVȫde ٢C׺e3+B@9_n7"XVa64*Nc !'gy)*{؛_:P'䲷@2C-*/Pd2*@d1=ᾖCB t"T4Y$K3 яpKe俌2M=aVa$*-ˢ7;& wyTLG,.M(;]l<*'ʼn.6^XS.kdQ-ŧͣl{<$r1Jt  4eLb{,*X=<-Z5LJJưN%Mzx ׏i%X 3BRkZ6ZsA8t<(݆KTe=5׾!4Շ[W=ryEaQ\Tt<.(ᠧVJgw=r.fژXS-5,1x"n\G@ӕs69ROgҏ~)]g~ۍ V޹&[P]oPThB DG#b(r8 8gN'v "+¥{LrI8Fi Q IScmڂM?w:*?_1-j4SX|׊o'a+HƹvXb=2k$UiJR-"ī(NZuۏ`@5Y L oX3="S*%Rx"%gRɬL̰TiƭVc  ( L{5p+@d;eMDKeAaA[ɔBhm5,/*:XV2CqtGQU<<eΏ卷A8F!\%s%YF(A< D><=83 ChEK:tiK/o|U3du3ILK!ü(0a%χ@Wmf UnuU+R]Ht >?c^iB/T] ߪʲau; o:I 3L_uje20HK{x424k}G$NY::XSK 䴠Hm@1L)B:`bHd6H'g+ϼ["*2VZ\vu LVAu4|,{b>X}]*qYQaVbyk.PV#(}+2Qe3_ kk/|W77T -:U~jtB*1 Au1yl3dFi^ј?UrN DbL.a,E8p؂)DKsNJz9JK9ą`DhIԛ@So),*XLs+[71 eǴ GέR$]bG,J)2GI2ڲ5qеxN6ѨvH{fig>_?dlus4K|iEf:6rW) l,}og Kt),`$24:@@8Z<9Xh0jXDOp eA "!2"S5(Rˀ3*&bK%wFĸAPkA&N'~;ǯ5ŶƯ(qLĘ 6΋~@fF?}m|C\273$I',F440((ciAC1}.-/GS3d,@` j6P솓Bsri4i2^}gARTGrc{<+{;# [8f(?I_?~~ 6mr9ʊ(IV/#J{#rܓƋ^]fs[ݗ1O 49"lph=g+l=j$~e:9Ogn`g%ZHi$::$iТe"\.Φ#xdl-$7_,!ՙ,#+Z:6|U!t]tI>f+[<=qqFl20Rsd2 xoJU ((HJb XZbMSi[uLw2|aqP=0鋅S0(-ˢtȞ=y9x8BHp%#1^6g1,EH>|gOƫʍ>j\ÛM7]k FϣѴgOʇ퀴.4QX=TRTR򫯤, JJT3bF;ŠNůb :}65b A@DF[b2t2$IX k.PX$`4 j>$p+ИδVE?PvVX?6+|ޏS_,aAs崕h~`b\Rh*# mbЏqEdG3>z5)`2Di??^v|[VhQ"Cřq8E6w/ mݗA&pc3ߜ7I>0I*YJn9%]Qt2ܨ'iSmʁiri}qKz*R頊ݪbmݕV6I{B"Z&Lj`ä8R0~am$3Mc~_ɇO|4Y\7yVy&t+ۯ1ș\v$)J]x>ɼiKN]n6wĀ{$xP+nMWZ?qŏhIw07{d_|2Jx?;2%p>D~.IL?s+NXnv}6":6φ}u#dFɗ$]g^մơz ."};6r素|' L dx!̐NfH'3 dzJN %?q$ hH'3 dt2C:iuڸ09sϟ6.%"QHI$SwU mKR`m+kr5g4:bq]Zen6׋h~M(6hh2ڢ1Ee"bGIYTK˔+"^JԤtj ׌R|p4Zp2ZIe˵j6-pf1f8E1\Vhgj;XE}uM,̳MOSYB7GE7XA%[x1vwZ3EPYXC˶x2ύ@Evg,yaA/Gli5}e=`-@ BHC[GGөl/kpF+ Yش p/W0Ϻ}e=$fgd#2XD/&B#VYd)ef(1L1Tz=ߧIOsgq)=8T}mPerLYw`ZEE3{Wj6mP/1ֵҬz9פjˌg#ت5a_r4hj!*ڷ{r79?qo7ޓSB[SMڙgd2kWqgeX hl(7 EㆢqCѸh-? m,}og  ĢtN(0d5HLrAs? ! 
e Qa1!Ơ@V"R-&8*dE)ȄĞD{)2*&bKMP+xh4@-AjMiMN^M?Vc&S0u\xyy{TΝhF?}mA*/h2vX#wASԑ꜆tJAy[Q%zezۗrB \6PFz |; jjOݤbvYO'6a}ќ;raw/`=1^$:کmhQ*M',7i<ۜSG "BQ%Z)"VHWƝ>Ε͕cQM>آtjj񭍽ov{.\lFn2w[UT^TBj,0cݗYZcRwX,lWI/:KN l^O?o/9+C0!*@X @ 4b﵌EOLF3R-Msx>>! FEI*IC;#D;Ṹ{뎻fdKaޓҖT}{qa>2NJ|4p S4pÆڜTD m՘C I*Ge}0IJ,@P-h&Q 1c[!޲?N-M=NT1s.@$!I״6R{X`*0Ra鬲q-u0y>Qj=XU?ϲ'6^⪷Ďx@Zs_{yLy??,i @rUՈ Zu{ ]"]< Ni[bT^)2Xu(_r"FqR>}m0w]0*c *l5 `2#Uw&0ͮ+J-4L]a[}|J{nG^⼬n(놚N & >,JIDB2VȒ:'R<45E Ls]AyyR!ME T#iҏ{nQR MfxlY IʨИ#1@0!i8f9A*!lH|\5qYWn@͊:oϡK'B*G40E+ `TL{ḕX3mU1@ *zEP7bId)gFJH0g 5QTV{PGOxZyLϤJpt&&tP pDb¼`3/%ъḪD0niv4tδՁ/VLhe>:xl,G53[FoTIϥ Pݪêh[Wf&vCuz:͏GlFhZ"ՙAn8\FSDX]dYT* -Xȩ@p9""/ q.(&1l0DnC/ (WKM) NJ;<RXLȳC/w՝ X(} ?K5 (ec +e%\{JD8+p@Fw*FvݛrSf $:<(]RA."iTx^ Rrq&1'T,XG璵)? }<Y/1&srAdJP!vԋ E"Ǟaa,SeM_\^6a$B@m$ա9kp =r@Z 4FAx! )晑e$l/% oڂCVgϿ/ {..f sB a9WA1 1Ȓ8Mq` %vN ?]WMMZ {otoz|{Oł9T )"a),}SU1]0X[&lB\:$UX]59ҨLGi8uciF_&mb։ԢdJ%i},:<<?*%(d=~6) /@>(1]q *jBz04i)rHv݇Oypv3 %3' {R}Aٹ"Xh$_0xZђxeK/odU3du3Llf~|6a0eP0Q1ihw_78VJ\ꪱb.TMF0pRՓ W,R4eR *UJߛzuzǰ/^xUO>a^? P`\__}>lE4o7oZڦbM_]N6zniEϟerCx||Sݤ ;:^!I| |OOj1q*D A2X;IDhv%Fu'! 
`}W)y.6.c%vE2+ۭO(`BkfI wLs+h-R;鞫gF\$lX9cμ)PܖR2|%gS4av@T#FK{onΠzFn誸]y#I[snBv6dylYލ >`k b+t@HmojRz\TknXi'U\fΜgp M l(d5qc(4W!|M5sij꛹}e0:Xa/5mUy/ڒ2GVwtstx;{ڿD_CK0`UauW% Y5g7^Fk#|#U)x.Hbאjf'Ni2]¨d?@~2iQ!3l;cI"!tP[]ܺMCp  uԎsBf|¥&@;EbPDFxRwzhA9DŽRn&x92J#(r;X+\$JnuCHLJugŢڕ㷋xcTrH`e?Ng{˫vT@ndzL#-0Z0* -W<LEGƝ9Jw z l AB}@uY0{w8tZ2Z]?"ݼw k0v:,5ꯟ}T~8}xnh'}[4x2T=͑Τ9( Pvѳ zixsog]= z} O=т-5OcEJ -2&H] /cKw-:eYd՗"*r0K&Bcd&bB.(  FqXLTqJ'0i/ZA۠FH"u2#$`Lq]ugmd.sN.~鍛%՜_.я&C֪`yEI)}[ ˁO1JL`6xYnJkcAǠ8RYAZ"s;P(X<9|vq`6˭VQy(FqZ+z-vȎ1&5۽{z;_k*ޕBy,t BP}d4ZT`3ĕ ǭDzGCbYK8ۡO*7E{@N=-Hf,JAhë8/RYTZA-7g;)I v!TʨDh`)T:7"P,a00HH2G'cv&u`o#bNP@+#%E(b J 1$P^4fnnG.YR uDn|cݻ_Zi9LaaHG)Q:dl`sĬ[tSn 7=ckwtBncP&JՙAJ6"(REipSuHȾqH%GQrz%GQr %G@Qrzsz%GQrz%GQrz%GQrz%GQQӣ(9=JNӣ(jT90kl#9}l#gHtgҤ,}Za;R!rFAf U`hi)1H-h3Jq;=ĢhYSM)J!Ơ؂  Y$"Rkb;WAhPNwd =!"=AZ3E4DR%nŜ(a㺎qYw8w6}J,őq8ENb Ґ@"LI9&iZ-Gd%NSݱ;>[ fȚ#, +i%{Yo}@bj.IpWͲ[0L׎5xGwwУݳm]2~4s AHRVa 2LwZ NZFL&Z܊Vƛf2l1y kKN|ge5VTtoP,ˊ8Թu&|8Ukjͫ:Wm̘M;ymתj_$0t}թ/ڇn?[eJP*]69T:+>Ԭ+%,uj#/ |jJ^F5ow&k|I/05؛p\<&¨o-vmMhWcW^.s7jۓt|8 >2l?9-pVME sՁli %SG\RB| Azj/Gָt4w,gF,B0#BXYL^jʈhA K)h#2&"ݙUAn6p:M:z% .e|i&¿O+=b;yoc&PBdqglrD`p'&.Rz|+wv~Fr8Xa!y6(H^DC)EK7?Ns:g"]nkR?ۼ{rp㨬V!J{Ag{5 ]ȓ7/07(wFOl}q4S(9!!AG8,)NJRM~q-0%p; y7J 58[ǒ" Ǖ!CZ#wøkElj:RjK(2y;aLqakB}xou?|+_Z{eow-'~nw+ی34G( nIn}n8i:6 H?5IGjH6IR[FuX;!ė JRƶ==[V#-خd0/ )ZG\ w4n젦OϦ԰$=: ΂w8tZ2&[]+ gdyjqxw}A[Բa*.uXk_?[T~8}? 
վ-Nx2T=6̤ +_wb _Vcq>f]Mwz)4TuÐ 0тaZ&<*ޓd+_fRb  : tcn3 đZ~Of[K,z-TT,*[Ƣc\WjI10 .fdrH8o[Tvz3>&&VQR5fg>OLj9+ZUp gװ!=p .׾s!sE2 6kmmCgD)E\BBn<>3^RnP JUJՔ>'3eKFNw62v8)I(õ3Tb|s6j G ;18|Wgסk{H E%;)[qAb ՠҍ Z WK+ IM ר]QQq!`f%*Jk7^o+E{u*WM(֢c'brKM^Ȍ ^`A Ɯ\ =吹kOZq0 ͺ{e|_?Pѧde@hda2'ZK(Ir yIEt$LEt!"'U2C@ֲψ"88f`2 LYGf9AQN& =r`iZ`%XgcZad`(QHLd$&F:łKs`&TH ۰i9-sZcrZ9Lb^lyҶpXǎExBf)V2b\մ* RJ/mPqxpH*qu:K\ S#xE9QMVh(Ay &`}"LWCRsHjI!9$5CRsHj>^Nj9$5f!9$5f9lIZ,JB+Qŧ*ġnO~ xb沒H /&7 $HUxGCOS4 ~5a9)j.K8VtR!)b!*A 2Rk= FDU?l~yYTx N貨g XQ(@1Y5\>ԍ[/cMpQETwŏ] j *Uc1o }cOO7̂SYXjnHݷd4μpaPHB=(uk %LE(ʤ[n׻Nz8fPG0/n5ff0+mC@{4ͽGs{4ͽGs̀h={lB>-ژK$ <7W{weP@EEDG^]+?oK1Kf7׃ɇ)6z6KtׇAqgjӸ0 Vg>68Ujp -].T/p!)ݟEp*uiBy,?3rJ64n9M}Jy͝թsD̴j|R KxA_Z@ÌԬgYSؗ/ h6^ hT_.<c[%#Urn[%VɹUrn}xj v˭s'U89yvJgrTXڷQPV; yâZ](Z:G:)V}" \έOLO r^iAsi߼іKҾo.KҾ:j#Y:ZCGw㯿9 _KGRbJ)hqG$m3R!Ơؒ@!V3D$ZjQ  3ߪq%T8 y Қ)RR%n9 Q Ƶ8#]xFnCj ~|ƼL;0jʄ)r7]?RGsVdmy84l*aC!GX98kV욗}7a2 4`rnhXWI.N횂gUjn+NS~GϊFj/ҤP3p,T%/ǖHG%y 楶|pi#Q_sJ}Pz/ӋyA%Eh=~r>s^cEtU9I<0R݂2#wjA=ߛS˨> Zp Ӷso^ptXk<0Ա k Cp-5F##`d1t1B#4Mni0u-# ']0,Ukp1Nꌛ# ؐT0o2.CYvʽֹr4`^ㅑuȼDN^kLJ|2hcƊʽf2o1ra]>oyFPspȼPAFp<UX S띱Vc&ye4zl5ͭ6ʭ,:-0#ܘOomjڨC7xkзr+huQL}45y`/Vj?LoL(<]?=]OoJ>޷k:}0+ԯZz+=MǛQC{<~=w M~qg:Hs>߷.#տrP=*үOpFWeW?|!l[`ji[좥DGŘR J[3P"4@+g$"&#*#+-Ykg_}}lVj5^UB"G (KJ0C\pJw1nEfX蟠{Hbo|©'Qq2 fiV`sT3ML-7~R{mpܹ=2*Q.A6F9,^f3 0/% eъ@4mC3;apb\Y+;>ބf&?+&n x  yeԡPEBMsR|2UV^0piގ6z zr.MarߖR-dژRS-MMu D/|G[{Z8߾*!Â}^"nP(PFc#4!>4EĴiVqbNno$\.%6!`BL1jY | \K?Ggmo&ΞQ :?1^n@z(o߶c +e%\{JD8+p@)e IT$2irSfρI/_uP,{"cN:R")98Ȫ̰4N+!H&v< ྕ+½ æm$vƃ8kc"J/>%>S$ö!UGKG@3쮮ҝyYטj:(L *NzQTA"gXT`AA'7RRM0!6МZN5BtD 9FZ 4FAx! 
)-#H9m7jptYVtvi9csIp!.3;}l!\?ڦefOE*0 KbwCl>G 'M62(c 0r|BLG'2Hp*pq.Fߵ0|qk THQnyaJ%EYz:?pݯ7ŃWu0 %k7UYW]  Ϧ=+3X/B3i+gJťwt]5 lsuR96a0,d`0a1i@d5g7kqBUJݳ.Y5kZ1j"t?,8$Ḿ|~]z6tw;8unI1T@FW@7ӿ篺M.&{Y+:EoN 8#5?>"uM -mO}։pzni "{?jM\atb(oәJKFH_Wr 7|H:bN{7B I܈yޞWbZQ`A< NWg6m}(m%c*Ƙ14 E}ycV}%`?[y4H)ʝ`L m;uX#Fk","i A`'$1ŌYё$|YzW؃`4OLj9+j$igURW`q=wZS:i;u\;f[yXzs[yh{"S/V{c|mwOm˦~_-#XuW5h 1 !?]9&\ =PAG7>i%'`㹫Yb ǼXtƽ4am^w̯`ָ51o-ӱ؊P[qs0ơ5y 8 mA*TX{YE|Ԩ׬!/o.}7Ep䦸A.N,>fKBg޴nN7`*Un4ۀ ]zHI3.քY}WMao~ԛUӊMircڀT:]~#GQ`7&>Q5eZ3W5j|Nk/4Qo[c 5ߘtk%܃ƂFy 4_ KsTfHҔ܁v]7|Ygz{(uݎV{=㾛8 {--7a߷Lj SԌ/45#IQN)93Zd[㫪IHo}i!kcZڲd[._ʒ+۶@SB=6½vA)wqjw+洠_zqoփ틎#{d@I)ޫՒ@:b0l*%zA+uNJ%zrnm0׿>n!l쳊ʃb_b-%:*ƴϠf>3(* %B# MJKD$2pDs$rTB@Ѵf> (ӏtlgՏ1DP0F+ aR{MAh1nYN蟠G\P E{@N=/VX,5 H3cHe1SUOj&϶5(TʨNGa"hJF9,^f3 0/% ZE_Ikj`>("bOP@WFJJN Q( JPcTI0`r[}-5Nwa'υj#2=VyNSDX6`JTX1+Ŗ p6uoMpt\6 y(wC}A$)Vg'v@J^_mDZ$$WO{%1מl;v+Ppy3mL* Ƒ@kPj{goSחRM D"RQb4:r*!qKC`R$LLht Vu.\.%6!`BL`$,TFYJy>%(tkYy]797{?<ؖ FցO +e%\{JD8+p@Fw*FvR.t90I`5ZrןMڲf25z8^TxGDJ<Τ"9̰4N|8"` (O {9Yw_~, w>:3-n0!R9 tP:('TBу`ϰ02܃&-? On/a$B@m$ա9kp =rh.N[CRT;[Fm7bâbV^?dC4ovi?8^ߗ"sd\i6.C $>Σ`ָqd?/n]OSxoo E"r'Sl=cH)D0u.e D]L&o:)N|l0N6`M@ZX.*Wr¦\:&: icqλIL6XɇpL!JX̾\5E'GL|2O:`&5/6@W\ )J/Ln6V4;+_OaFZ>yS2%ja!T1Ƹr&wt]5 iIЇQŤɇѓ՜ŕwvJݳ.Y5kZ1j"t;,8$Ḿ|~]z6? Kڷ ] zqz+ ~{w}}ӛW?y={h*h x~}ܚgZSSŶMND\8x=4AQz}~5?&.Y !fCK%#$+9u=7b7Bνލ]FTzݵ/D af@U\u:;(we}iK|6[2aCA0\<.їw/8nY3xuwHsIM~ ֏p8 w߶S'eJ9b,RLpJu vRJv##A~sFGxtd! 5e3W {0L:FP/)% D$"v PB=|h+W~QVm-:eY|ɷ"UռT7![3$+^v5اsſ/_T镗/~Oޘ,*zp=<`!2?\- ~GA1oE2$+(A_ n~1KfZY7Y&۽"1i8H*}@ ` IvqͼwKuU"y,Ax4 c!n/ ՝Ο? ,6j4;x0))L6+]8Jw(7襢ɀƖÏ+y÷"$A<3˧Q?֐6<@rFϵ,wJp5az5>>OV<&*?Y@HJO?dRuTrL9 sY04\`D$4)Sj-A2ѣ >ze3>(ZxTSg!Ơ؂ Y$"RkbD  3_kKp3"ا6,L/p8%b"b΂wDðq@ymL=DɰͅTB!q vDM×y޽m_>l:-uY@}J,őq8ENb Ґ@"LI9&iZm@[o_N|5E BEWd8 &wh]Yo#9+F.0$[7  4h=< $[VR,e["CA}<߶DQp]01%N!1Hٓhցà-9B iME/k 0z" Oyns9]$r (iB}Nd!OLJw:ٛ盲l~1?c.dy?qlU;s2[~|X$xC;n;L֬Xza_zrsۭ]\ڗȭ&SR؍몟p3A_T?廑u1=޾ QAqվO%N4uߴ mdyMfߓcf./_E'8BU tWKarCN. 
RZ L%j>E-Bz$:ȳiL' r^ * #H @.J`HS MNTY`&rp[TG䛒p3xǏ=(sC?&^Y_絣~n.d3 ag+!P0J, k(O (>DI3pAoT}wmރy%oAى2a *iH.- +)TX=%E `[<6gn5MՊkZqbjm֜h:ZUcz=~.݂mzrtX;aqz=ia5)9#cXP) 6p mOԧ] Sr:h̩xEPf- !z^O3PZYzS܂NyI`{H~*gg8h)آ|Vš:$.0H2m 0jLA^xV {70~>ܳx[2&]' $yEhwm g̏/y`4W=Zj&_c-$C~fyUVWk[FgZf' $2Vjd>Ez.) jT^ҽGJk!>*sEkֹB()b;DE`KZc;Fnh] S :i냳I[g#oC0z]n8?OQ;9 D0ÙPE= Dܧ7T*jSτJه~1:PGo4'Q0bp '[C,^6ž?s=9Ki?hu^#SB 3OeWDdЅ:hmtk[KJ=gԵD[X5oՀ^oIY("a}zTKŻb)L}яkh7tgLD“b2ΏJ.'}6vj}{m~ԵK3>&(b(2QLf`e,b%Ȯ4ٙ=C|9@I]xy"X"BM'ٲa3?ƻ{۹&)D|>Utp] |Nrg0$Ŭ *YS^ !H[Br+M)X1!ؠŷ&5i~~A zi%1%$a+Gm*.ܑ9aB/^VCCמzЍKP0Vi~qڳ470?MgM7Q *d h HVd y]28Sb.M%c*\٨%0̣Yb4dk #y^:V>bik{Cs(87ApXY*B !lUc$wVU 'N۰iِ֐֐vL aGEc9q7x[f+e&}d_krr_[Hf:r|;ygo?nG~?$;f?F_Y 46^`J&/.8t dX+ph<̕"%<m r=6T  5}62NBϕ>>^̏ ."-:q۽g#M7y-`X_`j_dZ_0,d,Ѐ:[%d8^^^5hL19GUgTϪk+?]mcSS}IOl#oˉ[w;(bwPBYYY٪ݜ d*\gNUX7M'#6\^Vm޴bыT*]C R;ܿ}}sLns4?At/gt6,nY޷ͧn>)6W'oʩA}N0+,q#,ebJ`sI:P:ᒵJjF{I( !'Z77NS:#?,^ѐyrY0ޢBؾLf*[t\K6Oisrk[M={P^~LľGgP{ z9afNg;|xt)7 =m37E2~ww=i?.V)1༢Ǒ卖xϖ~]c];iK6\oxS@yS^Ԭ8=m7p6znNx-7~ KY_5q&q?MSЄ$"بLHʄJ1>!Riuf&$=?z3W8}ru,'VW#iճE!OGRWJ6uҏ^w``U%q(ꊨ]]U*5o+ !+"X=uUE7uEJi]]U*W즮ޏRRWDn0ꪒ*"*M]CuE5z7ױPe_}\W׳c|׌0-߅-5_Y_ ꘳ss-1a{N^҃dr3\F>e:׎9MBDC iC6ʑ5vA\J%DZ|_WWc4b>$){cQ>9٠}FŸhᠽMGDct`1 29o9HcʁLsbL#;0q], -2(Uɗ(u"QJGvev -hT$:N⭓x$:N⭓3iŅЙlLP*xgjJ噚T|<ɻ|As^B_=fm@zm@MfCW=f4`6`m@E=fm@rm@=fm@=f͋j6{`@ڨ׺)sXlT{yzl, W^Voy(y 9m ZB&EI2AN){O'!'D6S *Bd*lsJHCꏖt0gt^sgtNOE >32Pp"74f#BjB*W "Ĭ39gk2Mc2^0%Je2ǹ N_!7V {v3{#gYu蹔tkK?}.DJuddSۃ압WX32*ȗyj7,_/^z)˧Fn644:r>Cȉ0E|H.}(lֺUɪr{CT#C lAN"(sH?t%]A53)RI )Y 岀$b2!+,9[4&dŽS=7rY?̖[搼@ei@K20wEpXTR2Q˝JQE%M4g$ E>HHK,)"[u6zd ѥ ՟oG-NjqxER4ZDW/˘7Y #R(,*Y "3*oqVwKv>_ x׎H >% JE^3xpb _豹+^(%DC,4`Ar ݐ!y=7 cqr\ћ3:=n~U"Bt nKSK1 (ǥP:G,ZiMŠޓM V5+O?<Iib,8SΒͣĖ}GM@X/I#x7Njd,%WGPT )␔4I XfӗWUNȷL]]ף>KPx*qБSsD E^PJ019,b*NIu+¥&L4U#Z QRk zqy5:j8[&!'PrY6}.xs)qY4ީId!*H)@s 7)r-Y-)HwD@L*"* *U}J3`G$O\ʏAV,O^f݆=Z*Jd*%NzQTAĂOgXT`AAL{!v2?jچTr VE$[$b i1]Q*ʐa[Fzԙp:v13f>.Kp?ܳ7-DO`ϥd'r&ї Ia@/525JP^6ya޿o;]w*S~7+Sv=F"rrNe: D݉)pe'I袹7@h! 
H ˥CRÕ\ܭ黳"Y0jub,>V#3aRN/mTY%Xwӿ+}{:^(:VHPFSM#pUփ&6߀gwPB*?fԴ͓uw>T7\4σ{Q()g2_8_+K9І?$A><a;PKO֞_U[7['> GL4h8.tyg?Y79Z{%h}I6W̥c2d SIs .={#~OpoTu'ࡊ{gs ?g?oξ&_ߟ}h? Lp6!' ^ViT<7g"(J?~/]JBq;vC&̝+@H_CӯȁXG^]4dE$i/߿-+1Q ?ݸSr'Yl L{ %jtq F߭@ v6%J`SC{NjVgؔ_a6$L)G֖E NDA[NJIc  yvT1Yց+{=@N[YB(Јx4BH.yMÊW&^Hc>ji)jV㭦/ij6`YƘcd|=ai*Y[DH(򅖜S0뮊4ۮZ)zڋԮfZA,,>&r VP0vY4a#A2 |%mh" {h>Th[6,r%Z˰rx6pc|g^,2UAAE@jO[ 89dgjᥜZ|vB䌂.`Rb )h3z ۷Tc:zRF'qcmEѲ)Uib 6@0jfDK1\AasN]L(9!}dtY짉gE{Wϝ{$x+)8^%ȸA"'1h|ciiHN4 A#2>yxwNT{z-`<_9W- >msP삥ff_ WJ웦YZ̳Z&#GFί>9 TDM}75ygo",E.F' X<zdJ_2ˏ?mbǣ:6J9g E52(,ҳ,R Ùzg޼ͣ7n)P\BRh8LyVqhtѤjo[E-˟JwYMPɆFmRwW?msy8`w{mf<,k:^n3V*.G{lɐ8'ꅢZuWA.. U68҂TBZdIXL4 K#z\Z/'L~{}2($eZ30k1hFs+R.*N- Ov2}|[mڵĨuvl})Z9}-&-*q]i*1Tzf2gUэ&ѣILWpsc%AZ9r *M6T Sna H:vޓwQ~乖a<y oA󀸷 }CW7<0NݔG_piN?mm{m˕wij<]N?ϲpOCŨ[]j, +BJUȂ)K #.(!>k6d =[n䤊勱5޾>eV=aR/tLjd<)V2""&ZROH8Hwf-[MA^OhͶùVg;_Ef?'敃`݆tANcA-Xhm ~etۨQ1,%! j C _)cГNp-vk5vH=e?0)zMȺN`RGTPJm@fXP9#l~+ i=Y|pYd.E`0>6# -o3 [aܞ+EǸ.8uX0tjpb#VQ׋B] W7S:eXI\(wp)Bp jg>OLj9+j$>cn}{0kz.!yO-Lev[h ш<[NOQG Hr *@R "2XTI1CQ}U'% >ev*8!C,T`{} ۨbJNlyplA%S_\Nt -g[qAb Ռ"#5C!H, *W<Ɲ'0~8qu m;]{2̿>דd◅qC~с%7iQhL%ӂ LggS⹲β'-˞\ii8$h"a Q.5a,h>hT\BNJqTQ:ˆ"10R.k%'RHY׉:fx,)c#EN(28a }c7tc-Kʵ~V+ }}8d<^H5H:`w7"U0laV  6NqƇ^K(eK|*S-dIT(%aF^#D`<"E0I45ܠ;A  }00p#\ P)s-}8SBvȺt>2$e{+Ə&+ W#z,^h8liTl0۫ laUz}B. 
FQ8P0eSDM1l$3Cl߅qCj.cdo{x{Uit#Zpf󫞲t}%zzE3eMrB$# Dg|8>Xr<Nn30 b <* 9N|0 L`HZʩR ˽)&.P2zE鸭PG Vp).}6^x#32x5sf"pa'RC-k!&[7L@]ޖ`z F_re@hd0'ZK(Ir yI"R:" df9 ȅ4*TPu4H`&$Z`B`#6HI{.c^i@ Qc{HLdu4(&S kqZfN˜9mvLQ<\̺ @$E9=oXcs{uϱ9=C VxϛyZ6f< sh⧟/WqOhrhOH0!K|m904(-9/`NvC;vW6=m ;r 3'2d9HE*L<iX{))Jnpb΂!#]MRAsp"wtikAI>"[4s|2e3g̘1ǘKu:3cexfs eGˋ^]J7 A*kܷ8=J ;Ҷ]ȸME9ra!I%xt>Zd-؍#):o~Xp9Q[kFԨb1nӡٰ6 ֙M5MԲ@M!ٻ6,+.)/{x1^Ĉ"9$e[4 J7::Dĭ8'`w+7 +U^/Ӟ/z(/\پ3*mntTsO/ E/6Nqy,Ih5tzk=V“H#U&ZȘ/x9ǽ9 SOfgr2Xt;1(W3uy^SgwVs= Xkwʭ% :_b2eA]",F!_~:Y3d3\P{2245Ud(SL9EFe;^0vOYn/ի<⤃nIE 2d%{Ǥ٦9jE}XIT.˲~Ozs:x%Y{ӿ$/ү׫W\Ͱn:}s:*\~).q-Z[e.R +1=TQyqտ˲T}݆kBY[|hsbB%KO?+gko㖊 BX!sdGr>XIB& Jt!+K?%EH .ls˒G!}~ZgF?Y{[[ܡޭ~Z(*s 9@Zx"Y$a6*ad1X}IJQ%JFEKL"P۔USJEUopEEqOv2MJ&'=ig`Hk}? CU!k-X%Jb*nDߤnERK5_;St|#|5ksvY!f+LV' )[JCU#bDTf,cSIx30fE̍NYVU Żp%D>W,[áuLJwa׾o m7Laa"[>`w!\Ј*|˱y2ឝ˯DGBFؿ.Flќ?zsI OnY(KƝG ^%c"BJ&fW5A!x\<>oΚ {TeEM8U(SmZ<'*JI&YڢDsIEH /0Ь#1̇9J0]RwHRՆB6S 8V s"_eAsFPK|#.ed5#(xK$ֶR^*(Q3k,I:Ŭs)Ȍ94J4N6jڗ@idw"+]0CgV JBuvRQ]J u;"T# z)πVQq S4"XVPT@t[ x4shltG*C 2 _.ҧ_dp/S8k#4g,q YfB4VX4ga)-%,F(Zex@9X_wZ ZpW`F*9c1.b ~ [-t&AƢj&2b@f Pl\FƑƎxXdc vEm%bx%Y j3xW9k+1h a6 ʑƎ8fY740MbDb0& c{cYLqżC9 ȱ2ؤ~ήZ*)!b:%a7k,X \3kGRP6Z'O"AOM!J%r {fS^J'Ysq yEVdy-Qd Hc UV+tʰ*TQ]k?Z}p.ihȀgm<_.[[EľcLDq|1RT6z;C8I ">ìr L&~1ˏw Y0 nPAӮdD="[w-%i!"k!J!#/ፍ@c&]I :2i$\:p1'8;獵 5:/3$F>@H$5T^e^+]7x<{*K dc"@ Itf@܆m}ů V"Y ԏD?A_EYQp 6Yr6e58x*݇ZH >,C,[{Х=PnB%| \rjI7k)2==CM9z"0Zt&!:v,p ԁ`(;$9cMm!ڡ ŊV N+}KƂAa9(@O"}S8$}]ycAIpI`"Q\rP+HƁgU?% PBM}LpY%,JؒNXhͧVr?3 M=ʾC:tf1ch%2u-A*5&mVDEK,6k,e4Cn$,BG l`_tвW S`l-4o:hym]|꟫cڻU\י Y* @zLqt*MN"i`t:6t fLS06,,Vwm+Z)DZ&s(Y<0fL0a#ҷ 3b# } A x Xker݈Z>? 
^ JTL"uFAvPX&R4*DŽ*m )7!kVNm.[d+ EISK᪛%r,JM3`Y0p0 :-FjQlw ;_CTi,&WQth#`A'p `NmU;3"7 3H 5*Ei H>5ԓ^Q1W0-l]H΁?!]D^Ϭu~ThU_ɽSfHڊIcVqڂi#b> RStijaiP/!WυiDF9Lr%q=G6T(e\YnMDCN1 4&J9"0i>ubEW2!Z`4rϠvOp]g 3%8 N.YcՀ__]?=*L8a&K.l~ualE* Z?UZumo_ݕi/o1?XUyoХXp …T=MN" Sp"ImϤF^m )J5J;WJB%`m]%F:VWt+3D9KA}gQzP=$GyNj?|m;/;Vn_Eld/Q)\=I-4ڷ9x}'ȯfyՌ%_ONϕb4:M0޾&0(6x,CG=S:N+++++++++++++++++++++++JA:p/?r tP6&IL4Qnz2Z|҈ 2/שn^zkG+0>y}[.oexLdmNBA aƇ,(E!E[`{Jmh;v3prOyϦ[|'(=~Z3!:_-ު}i뻇U~nzvBTS7,nw쁡UocbftF5:ݺ:?t}{ 3:9|a^zZ]xL݁A7o\WNrZ{޿x}GGu~=:8Z>};\~`Bs+{םXr͇tJ3*yzzˢ <#tTZ]]}7{f=ź&c!\}z]e= #jxCfny[* vըtts ʋT:QJT²JǏ|镋JJ|i^C{OowAJ!MI)?[&TߌU[[gXlK">Gj$F$b]Zv. k]ѽE077tqӁ[9r6Qo {nDyxws}mo{y.;|Lbuֶ$ʙ'+;N ](S_B-ų:is&j̞S3Gvt"೶/Jjڪc::fxETCEP*I6s؆sCصlr[ã,,9jTyO15os;mKG].ҲT=P/9Aщdlwm_YK[+ R'=)4) -D\KJ^zYڲx6p8䏜DZ#'At6mҞiyin ^y?= C"A`fa9V:NQz2d#Z+ &w2gm2gUz7fT| )ŭYH kc TiLՈ#F5<~%h-G$ 3C~`Oh$, {!"+T!1g{Tj AOI 2rw芡k4ج\-D!SSEcXYL8ؙHhA:ъlGQ! lt~}' f];#8OnY+5 fӶ6Po8x$5`$SX.Dt788uHmDDy>ǏEZ*nYLPklfq0q>D%^+N=#u3$zC!Ǚԣ=X8ce]<hMBe0O1:-얅'}v+DZ=Q}uovneWosE^ Sӟi؝M_5'pM>,b yX> Wܸk1XKUHL$RXJ˭ K.Z˭f5`I1Vz:gKI aF}v?4{綂6 UYlF\f$0݊"a͉r㜞d#`B `m"`e9L'c^˥'),o)O2(ZHi#gAcQWXk%:3˂ƫTQnйVqJ@p0 C+ʺG6R(b`#X^<*I"'v ?NkiFC\_r/9Q+2Q1"&~NZNWѪy[RgA{G=s! 0绫A*ؕAcEޯWȳuu8$@f`40J('wFzXOy8Eo<3xK[ή69z!ɻRMLjx #5( 'H>Â;oFdp0}>Ud%oUٳG%`267oy{ k ,!hJUr#Ҍfwx*V+ /b aF1K㓚.9(AN6U'n` BF6$Na8̪5i8lp+h83zڟ\79eGeQliTjTq 4א/󀏞zh8MV juU E蟝8~ͻW/_?߽8^zwD9:| C; F݁+CC0|hnWiU ^?,VJ8P~y1/0/Y -/א)iFU(?J"!(滘/B| >d9p+b nyt/aLﹲQ}vɥҺφT9ʹz(y}~c ?j:! ]z:$be$pe?6ht3P[O{SA)ff!bucN5fJ9ViR~m3)pVwL݌}mpZuxK-1q7mW@ ?w؂KB(HoFpk7єۨ1o#;gcm.GQT^ON}A4,&pV~SU9+&WV^wpyPoo^inpa~Ja?xeEI|Br!gЌpQ]`p+PZ`4J^%^J_ԝ/kʼ~o Uڪ4yJŴiŽG4*lGK#9iQuɜf\@l饷efd: ;\Li|-`$IK **;&Q:3"s<&TIn\qDޟ ++VnJ#;s͕+຿wbb޼׏fٔq94,WV)UėвwWϾ,2'ʨ# f74#9\y&${CdTZf#y!@gI[G5DQ)b"X *%Ɗ'ieq!Sʉ(%Z'L!zK;vd! 
E*2*!FI4|"}/uJZεJumQ$w("u׭$T9\8sj([hUYlƾI"%acw8p$E6AvDt[Єu:e )V|-7\71㥛Kon9`dHQp܅6rp+.5̲p\q7CÙ_c%P i@ Du+ʺG6R(bM*X^<6^$DLm1K%GXB&*FIˉ`N"f|n rH(RS}KELMEީhޟU-#p7{ OX[b(NiR>WI)!E  a̅3Œ]Ws"+Y6ljPm'T l5 $ 띑1Szn3Οr~ڛ:ͼ0&0gش h׈>hZ3mSS$N{tdn Jy!h<U-,Qez] N}%k3{64'w=FpPB%Z& ol "oZM,:a268ЂBr5Nَ۸n60ҚL\ "kWNk-nI,w'0EI]9_[]cVƶS:oD,[4m;`k-no%Vv6 h[%zA? p σ=oXJ;\*J]K/`-R)Kf(Sx9siW2v.ʹ2i;yCM3qm'ً$sQԔs2-Ilr3r>|Ϸ9Z%]w;E]Oe%>ힺf?j I xoby9=ލòpw|B)xXjZ3erk#wVOq͇< GLud]wCS&qkixtp,梮[ޡ,GnQiRABa|"q{_ٺ%az,@VUM$l+@2$i$=iZEW N^C-^)}PhlĀ& ƢbTKPL$UM2tnGqm%w}bJn+D%8֏Wdɻon[pER1Wh4Z+X|~hQ=Ww҆+i@Z1pTL9Y&jIT*P_ ]X! FKEc1G+:RoM!.4\%cO@ lׯ|ܒ6'*Y i `,mD"GQBtjٔEp:rY`TsӊQߜvv4llDO2PZ=1gUQs񦩃, KŊ\ar* S¥)A5.őhlg9ΆF_ƀCQ!fe}kuKC&G]#*(JCTy7RlNzN(kTj4&e@tҩ>XQUplzEzVA4FQTH#d8ip$*k*gP9D#_(#`VVI9tv#3exC9:.cX$t⍐nA3sZ'YeU@}JVrjʷJl>q su6;6ɧl"PAi `QFbtVC796LhU( :is:[C?fmcMaʣ7@=-d0&(D{$ڬC˭=FѢU u>*T9+st%]#ȫW{ڍ4yh7ii`gדBY>ĂJ5cd?v@]$( $h+47*>tX+I9g-I˸ 9bREG1Z3rGk=Ը7VtOjq3#hw}[U:m/?x\uhw?BA=k:ޡ)ۦm3&.Fdj.{v90y+H 2"YwIF#{ <žvjS1]0ƴG +PY89? !GRW2(9PMׇZ%jLty_Y"2< 1:gKֻB7q?$BCj|Ō5L{;fޗGFDnҟ5^{o#9!e|y:ߐ~/q*E[Pѣ@IQ䳅T[L LL L LXR6TPIz2;_|U-h$P&\jEA( d-y7B~PBߙh]B^oůeCʝ$w'Z/bbzQ/īܹ4jjz&%ڱ;le+rɼb}fySk &bNj@\:FZ1|_4-kC~$a븦nŔUF{Ϟ"U7Va J4W ײBjg8n+8ݚ.hԞl?-FKaWyg2m_Bcb* >R,UC%b%ޅZɤROAE ʲQTS) Fh"T6:uZ׏7Gym˽7nz ATR9wW%s1XV` &Z|j5 Ar a⭲Tr]=%Wc-߲U]OIȸ\}l97!+A)3vD)]j{GA[or}r_&믓r]p&d:Fм.YcH*G//mMGq@qAaaXR$B֩ÌccE& (YYs5Y&: =8ml7dc!bP+_kR1[rPJ ^Naj5=ll!hivLK;k ;Q=B}v[~f weo~R+uVB Vd Gȋ*;juS7|,\u&ArJ)hbIjKh=[?9V/% cct!5p=xX> )͜ڠɄ]@oH19MC}nn/iìZ/yyonO4Xtgb>gM\5gMJ0cwvΟb lUWj9twդ8w讼+jtmҢjR՛qWxǟHI<5zV]=MZrqWOʁ+|]=+&+/wu\UhIitW]2h+بAWM\⮚8xwդztW]iE Rjj&5z AgX`]5qlU ]5){tWhksڭ׺)/|/cWU*jR)ꄪt 7lLjHYh*a\R0's-XJ5{tP#z@%v,Ed }`m d^~\_[}n861wq@Ocv-._+ p/ thBE`}saE?WA)Cy! 
y2j U6hY!h)QU*:oKЂthE"*|^nN$]I"8F IZaBZfAX+ZvJO_Rys ts ?ZQʶm`v޵ؼ,k$]1s8ˇu{^[ת2e^"?͋tDȚP`DYD:bb+R,Ck&'Zmcιn"e-ihgRΖom q}w鷌YnzawsQ%ZIZugU?࿩T~ݰnU׮[vuM{g[Yrf~noyu{˅mg|9t}TV:Xs[7ݲ.=$ڻxf&ܭm཮U_e|s#4x7Bw]R㬁UJ0.DTE4w|wS=k EA$@D(E,JXDoJ ^NTlS]L S3-Ry( Ib6 woFΎ4ԝMO\([^oxf[0>.\٩U{d Ϸ;B)6b A@ȶ(ЫdE+aJm՝˦d1xK^J}4$oRK$kZ< 80BU%9%^为ey}9n)zb,_ȳTRc{ϒo&8wS3"%SWx@w^4OQN=*}~<*HÞxTe{ꝙԚhƒ6HRD/XvbQڗMdd X}$¦Q蒭t@R63%Z$GJ9E5EuZM!b3rCĪC^kG69dr\5zNimfe䕯'hLa^>rT@8wVDr<{U YfMDtLBE̦'YLrE{'=-`f!ycDJ9h`Jc9 VưD9 (&X…)bLHZY3rv^~bQsN< KIYkj&" bak$s&:'@DAO0SNY4 1 BꝂHx|?8pZV:~'] ']As h 3@ 2$),,K'Aq4Q3:8{X[+47Jld)rr׹YSfxN딕f*2N$o1")J! @㕷.0mS![~ѳ}f,Ps-oS6P/Aj%&"I# :eT]79@ģx\J 8ҁ Ŭ -]ɣ{r]6%5d0$T6_QMJ0g\jjWjLC o`V7ʍ[k*E+IԳK儥2 mkJ]°E+ti|1c$| \X/検%aAra=ZЃI% YvcX7Ǟq6I=ڭgbΧ 7u&z JjkO j%eiC$ &%.'i2M; Vl6FG~sfdҥ6% 'iAj+H٪iol(cv۬1; >s&okQŋq[SٟqkL#N?**4%%?E>Tav3_w.+}枼!ϛ&O x?Nր@"5.R up&>zxQ0I_Y wk-[N IѸ$:cW3_ hw֓ҾZQMsp.GgW%U~b^}GcY;]n `"*?ؔ~`[q&Լg{?DGڇ9SᇋVbP(txR+ʮ&?_]Ҏi=f&/ڙ?ɺiifU~a149u`]jO'BϮ<>9>pΪYnu*k":a4ա/fʨn9:W xP4v<_#f?_>/?Xۯ?}_>H݇_~x)cH'៏#?;SwUywi}p,MPZ|I,V,~_+kGydWr</V| 6"pX3_o뤸~t{ ŅW;g\Gz~1[Si)#A2\&J,m@TZ0޺"'M$՟avT)G)߶STĤJ,t!R YsG_4n !:ё&SWY{~" ,(y ~+K̿97+[=d-l"K PM6xE+x#+SAo=tM / V;dvW-|λ/٫W:EtZA! 1 \?:ovY'|vz#?'%j$cQ$ f“AɁR>[YD(q-TCjwʸJr=k^Gճ<MF|_5"dhtۼȽ6^kiryTlf狁{w|ay烐W۲xڶ5|6o[}>iҲ>bi-GvDȚP`DrcN!F/}wEe(1M Wn\ vTVW?տ}:Ǥ C"8(),HT AxJAOAQDk; )(@)m4"; nFEu?֪]orծ^m'IK~r>O 92Eo_ŝy-_\VNxB0FJq}'&gH7 [팖e7[;=MطM?&Q{:IS9VTѲ ]gK_d~Yl")h"g׳@ӱ;._a| m6T߯gwl(9:\#_#og0ryF-ǥvN爋2I8R2rѣ0p茡Շzc tix +ڿETi.n=~C{lN$e]bٛmwq8H]Na/c;sߦNyv*vjX`h3 OwHp{-Æ\eW2LG'2 Gzcel5.Q,zW懛9Z 1U>l3- #ro],vfN?9\1_^f&*n*n qC 4䞇&(ř@(q&eGb@!Yֳ d -k_̗Ҹ"C G< ٦35yߚezz[q9\vwEȬ<07<-81*( Cxbjj']DH6n7T,mKȸ~KWq^ܐxt/s6GJ?W%48ַW'@篎æk>s']2l:xd]@[&wTZ] DŽ_1j 6ыjY=N37Z%^;eM5e(aTW?sK?TP~(+;ok.¬Nsy Jec`A,6YRDUsB e&xNL~ >7?"-x㵽#@5/]sSȉBCkfi2\3 B r8̠L C 453fiLg#2ƜBj WJ' !g$`وL.sWZ.JTWQ\Ij x;'kv+')2~8Q cdiZR+oWU诚F <4$Ғo\NI\>Sp3;4Z8чi.Õ<.#_0}0T? 
R%ߣ>Oni,qOj|s%0E?(ˁ+_&(AsMb5Ihx0⠗ԗ\%BҬ޲Hrw&RH xb'.rS1Te`B]Y$(H9w X#cCjVM3uфQR:վNST:վNSTX:վNS 9J u j_שu}j_M#>`բzs Ar=Ijd](Nv2ɆѮAh$K:[ 8xp2']8--@3^!yBȼD’&$G2$D"QWn% ,bnHemr_'}vn"~ɉlh&s&8 QV>v/\X=vҕQYsJ MzjhEUch aPqrZ toAp֘v4CQuđ<_c Fya{tu9(Ebh18o3 AB0*QYt0pR9&xvϺOy3掽*3 x-$$ 6@"w6ibMI 1Avw #SHG4ǿh@F1.:RDݝKr֭g;z[>RO^[q t>o%YAw!}'/dwNd-?͢R< 7Jd`"Mڧ`RP5ώ٨A4J˨WE%*#9C" EOkB2.o\XR-lB!UZ2#gd|J9,,,Uh6bn)Y8-K ol{>LOE6[4OfFs,ѸT^7*'F4:`"|L+]'Xl9kʁ&*$iT@]&ϣB0Hdi/J@Bx-_VU@3"UpjT,e ༆(G= ϩ%0eC"""H;k;,zt"jWu .j_0 eNUE3N\eAgkZ0Y2*j,f_RRnhvsoʊcnh\ԶqhF& o& o& o&먗33+Us$QN9#-,8񁡒Z)(B\kJ=!&ji.#ygLD?^Y nc0{t0>(}fa4ur[/%/tU,}\m?nM\/ФXڥ-ڝ~r?&*n*nj_m,q+hJ"<7x}0\vwEȬ<07<-81*( Cxbjj']DH6n7YHۖqǗM{J!I%*^Ypq,1\|hnqF7H>qA^}oq?CQN+_M|8xOd[uf?$yT˄-ԏ;2@V1a?WQ)M"-lj{FKu߰ċy3%j}gnCx[=eEux-pR`mr5U8inhCB lL,HC%&^czNql _`O-f1xm-/:eͦ˪A'Pox 9qu ?M=HY<!2 CH=UD:@Qp!9G)m?ZF'g̠Α"ZZg@zgu<%Ufu )q*'q+&ʘ7Gш$u^GB#sf&QNy0FdrRGΣq!1"vV*NY=Ji J490G8ŭ-2cߠN]fk_}ng~[J!:~[Ҙyf1GG P3KZ/VU`:@S@f;߳UI74 \$ihDSICOɥyXL KE*8AmlmdywL^+")0( *Ź'ID {ɹ!j™ \*ʭ mh|уUE^ >U|0 Mm&&YʙN"D*8%9uIphVX7Vx16~8Ѕѡ"mm pH YBL=8۱>щPKtD;Y$`06YX`8xP C>&06" ,2kJ1^|5+fCc=<"G$ J1A re4J96G:-l`my.g?^tǷ N^zM6 "2'Yc-!c`sӓj+tvQZ F%Uc~u㻗Ȏ߼ׯϓyūSǧ'_g`=&$fnt֬kaĥ3KS2;o'^$W`7a6n!Y}C'_Aʑ/ QOX=!H滘/B| ܯd>^JA4 ʻqcU7U2*)udn,ʰy(nNiط C^zDo<1"ʰvt`pRnN1Z{fq8e4:0SBz3$J9 ȣԚp ´<^9h*G%qR  CDhE)MV6+^-BcV`][[_Jڳ>Nq׏ԲS:l"b< j+NBa£0"֫VJH-8gKtg}) FZS 1PKwQO$с>ipT!*NBۄZ *H1[~O^(u>~gek%t'sew{C_2k ^=G1ypԱȢe΀#21$%P%QSdG8l9̄-ܨϔOs*>--sk:;$*s&NBe) e{@+eDOw{O:_"웎 AhH]U;ek 3+n9扖c$7>~/[`f;+u-v\}-΃y, WgIXY;cqyKx$AtՔP#ZJN)) X3#GεGNhc"1MԠq@ :g(@ Z0*QYÑ'sakW*돇W]C|jU3f)n/C]U?o]FwT$,% -pjfhzjozp9.lr7¼D䭤C,úK}utQ=bG7wbMQnya8^}S_#[/}P}UEaJ/Pkï]7俹㺥w[hOimE]T;dW}sO+e OuZÍ{6jwGʘӮpㅣIU5wA$G$Z MAfDjmDR"ϥ)F|j75 \{` D8.DxP2#\YGhk1->t/GrkJa-}(d|^=٪^_2 9i 23$ 21eaQOgD(hŅ;J7vh(Xj#PxI.Rg:EX1%(TɓԷFg}rt}vss\STd_,ȳ4Tk k#kMmڭawPLgقb d\2[ mldj+l$=#uS|dr{.*S [2W@h3RWH0٨L0E]!>oXRꝺՕ$Oexfʟ oO.MaG{SRRM2*L' b-S=O2Hƍs @NS!:͝61ʤV% &.F9.@e 'W}&r P *aJouz޻JfKc/{i/o<X{H(!G(*ͷP"gd(lTE9 m;T`?dV@8lҼ;㩆 {ɰr#%P2&=7)M{z#}/H$\|@x Q-*R+@I9E 
!1SV*T߷9wtgM%L+EѾtӴj|`ۢew/ꆲQh8rMLrf#PfZic HQ[h(+erLwۚj9ꋓ=$-'P4pj1?>9L(%MnEjY,I+ƙ4"#G!gQ%`HLD '\zϒ@EN-Yk)gW;lVFRm Pi"DZ ǓSZG :5! p<@t'm_[lSE~b(g@_QU`ɈAx+A#~PQ;OIK;d/ \$ihDSICO<(=3DxS(*O|!%{!"^kE<R{:DId0n[2PμpyI[{+"H[.Sm">ղj-FM8lb<.BSSn`ɝx'306' W:tҡE(Vo{|d''o6y)D>R$N<)A$S8]Y=M||8Z'n5T$$ "a,D (Dڔ`N%.:ͭ1RH9 +$.V;E!2at)mt9kF6-5vCpvZ#ѧ v 03cIINIT:%zevAym9$&#\Khffy )Ref JJIh(JD(Qr%@VX22n _^o'ׄu=2:W%oмhI bg4ց-ݱwXi F6Fp3"'"Ch[%yJ>M :ˀhn<;AjW;ɄXoSC Yymf??fk.BJM˳vܤ b S=s9L^9cy.8K:ߦ*C~(a^uSۻR)WJ`({sN3~yR~6;?,JvcVw!^u.mJ3{T5mE˃KB̐uW L109A[q&$OP:awhtztk"AnNo`6 F?]kNL<=<4%CqvE?/_ۂnHބdP)GywG?mbH"YbwIoE4rxE(UFq?д.*ƭ;G#]qe0U RX6aXѣPnݜӦo8:$xbD Zsa5.<)wJ_S=8H2C)FFLp!EIRgX[nuWGjMDŽfaZ[4 #HP^?{WXJ!O@L[uF> sW۝J%r??A-dkMINQ/ cqo$&14GzȒ]yjr խxV{+*UOFYmCe\ג,nTa9BC`20<1gH)Z}hKԍn}^;Io5aJJu!J% (ƗrUuЀJPf3˵edAgZp0Sv* T-TjU7UT|󓽎| * 5LA0p]/\5lm,/ ڭ>Ʌ'j3`%F2$hN&9E ţ<5atҡ +_Ӆ\qJP{d?dB\I ,El|`Nv\ t8':v;EbkeiE_D(E ;4[.pĈRwRUm 3}Z+aH@!C>'bjə"dI—$q=f.p ~w?1?5MCݬYdv]Ўڎ?f˼.쁩V'w9w7WbrhIqW":g;Ġ?Tu*ьFϟfL0p˽9F2ɖi-ﳙ|d鈙Fd|w|>>1{noncRTǃݐJ?Fi2׍.Ĵ;"MFyŲK/M* ʱY_7`]d ;Tpv`]u=X'\m\g򒔳9JWFk5*$Id\”IK L Kh%F4p6\{4k!fs1}߿çb]we[vGX]իQ䚋Qj&j1sF eN"-*z5WH$ke= %&]NE@N[\BV= Pz 18G\q9vPSlbߎg E HtProy5#@r 6*A(`37rEqI} Xt]8u{w_]#ZwA}"Q)WJAv-$A4s xd$*+ 9R5GR)R*9Yi\! Hd60ڐd50=zB]*HJr>8μ;"A~OZg 9A~\H'gW/ $v4Z b. rO0g]PC뉓ΥNl1WRm4OeWD&:3-d4&: l B$ᙱn7CQ"؍"9((iĉ`=~4j1sZZ4<+QPa\-< 2)98Gz;^3 %U{k"و@Vķjdc0lkRwhiCq{& O>Yݱ/ Bu 4;P|Z cȞcQ yt%,Qdڂ/ac;K*\\^oFnp&(.fU?O@zc@*i-9A` L*jrOOw!\>yMpNoW>{u92Qx]60zR.kKr VтhgB{4EFmMDMjVmʤBN&'Bz!B s`aZ,PXxR,pg=њl摅g?fqY4n/0?MngykIt!DDpQ!Ha V|"^zCn1"mop,HP?@BMM4&Cr%>.%£opZsd֜ˮvZƨj TM2iHn ̣ 6jt#xastlSǜkdR H!h1묅Hn!$0ֺ3-tl͠i vJM뵆@)67OxY. 
p<}*cLAKx sZ z4**)`ZzKV ++ϝg "j-U YVQ:팕}LVy > 7moOUQRr:(OWdķ^䌮5[ߧqve Rӷ/B).ŎԮGOI`ЛާU\}}ZOO?`Smu3O&_kׯs97)gc-Ҝb.8T!d!89Ԅr:AhDz_㈍&^<-Uًdhz~~տV !/3-RMNVX 5F8~6[XY_)ʍh׭=[P[u{7]T5,fЛN0Xj&D)т.#A*IiI&9P< T?(S߹]~_g'umEjшTe-˖_w<+[ ^[4sPA1/O֯H4'l3n4l<;0=jMo.Ul\he8{-UZ3Ud`_e_˕[rz]pp:mtple+R7.Em7)S(e|.9r#U(ҦcUx 5Qyt cw?3&MfPWdUizfPr-m 8A֠GpU7pUu/pE:yR>\9cUؘUھI:\U)\#z+\8\L`+/ W/eEj!/W/:WpzW/=+XJ* \Ui vHF\}u=+XH \Ui:\U) •#"U?JkpUjWh0י6֯^ʺ)x/ l }*QPahbÜ(q+cQ-v*OB+mhLA{]!H|4!Fn9*xm9;>{q{v`SPa^]Vjcv,![w7/[Fm-1'"[ -z\B(Т  Լ*8ޗEӖ踅ub@*;[ +gLWھ*^&GdJ^[وM*:GhP82,d<,HL ~N=c 0,soT.%[ID$@!8 \p9K!5YV`Xd&Nk{01m1cuwZ/N)|t"$]. *ϕŝeָB5F9;r4 DyI,ihw$n9_6XF-vHڗ_Z r+fnnƺEK &0Cb]$CC$c5<CEt 4.%K{."B.jМءI .lv[ Fdlp6 ީT2JS~hr9kUrp! )@%g% _w Ѝa3q읰l_v ZJ~j}2[}k;>rå=0 ?uBmQJ|%B|_I Kxϭjl1aE㊬\g̏[Tu\ͼ6r~N&{[{r#;7s9_&Z=uu,m㯆POU;cJw[*x)tw >h_eU_B9N}!~ĮWO^_!?׳;qW` #*YZ9>ݓ*;͵LNxMD8˓#krZ;,Уn껟yq ^4yaDZm+Icbec gf4xYC3XI? (}][oɱ+yI0r_ ٜ  pb}Ӕ"{OP(Y#Rˢ1`[ gjz}]20`ljIv(Iig 3ѓw%d EE%{V{6olĹۂoY߀W^gk_#s9)TrW򉘥ug+f7u C9c"9tQ9gtj- @Ü0bT,aTe޷zg~:5MAjCY$e#ҮL_Tt6f]ue&\DBfQ`\dzc`)k]N)T*` %"6nX} Kor/fh˴ :evٹWyhNqh*cs!eқ>^|ڇY itV>CȉmքZLHLʢfcDtBh,cTg!6`BaSZ!%d,A+W}343/רYZ …1b 3!knlgz0M [6 *HҺյ4t HAHp4d (NQG%d+Fh`9 WO9);|etR s!FU&K[ ecdV;W'0=Ñ( $}xij`F"ﵲy3-ۮP:h!jl=&Aϖ5F?^ħt)ەGo$rt$dS01e?x)a⟆-_ۜK摱dI8:٢Pa>WW՜q^MN sɋ̾dr >'UsX~1e,'^ϓ.#}^}&S&7ktfIb:PSBI!PG1@:2OGGy溋c#֭¤I(D> g(⢱>L~Imգ=]z%{2HNNn?b|Q2]ԲNs}|}xNX__~Ƿۯ?oUD=@n߯~֤Zݚﷹ&~iW/m#MPZrytݟ6ӂlH!T5O Kj~nfċK< VX]\/ -_}z(w!~0`(/S{**']ΗF~”c R& 6p3?4vhbb,=3[I(IHNi0Y˓,}:T(Z*YK꒳.T H%jWώl<.ږixr"*gstM6Vؗ( CT Y  PM&^N|#zUVY݀z@Ϩ. 
B?U\ϥ.Ka$cîxiq~ЭfA٨c$c]QYuE"bx3uN["R8(}q%dvmzJ:42&2 pYH/AO18+%"B69$2 +GjmW_OeƐ:>~< 9F< كܓy{?߽NB^sn~z>`[UOzW|`sh`.|vK.U~Qw4 WZyJG4 * RXN돺58 0)xe (\WbeTf1շp8QAkIXTɬK2t)@щ ivP4AצN |IQ)xwUzs  n̯]˯f3˯i=2nߍw߲O(g}5^vEi11Y";RPbj^Q#}2Kᑝi`u7ԓ>|\|?tM  =nk`}sn緗sΘٚHP rN)R"uIҗ mm*yh2)!Hv~[K>ۻUPϧC!8!{ rD k(/D¡/Lٹi#@Jmi(smP(edl7aN ظ8{l{\kVмq/>F>kA xH'j/kXtEox2;䪯ܺnlJuntQ/vǙ(9pj(]Id-$]Kkʤ+Rz%#6C@²zrHSqA=&}<1'B{) Ngl&ݞv$fƾPQ}Vl.u>߽*ziWOlvz>.W߸(t*B#H#S/`Hh|Q ȴ̉}VlIa'%tJ6h!XH =Aep\DDj챛sC.k7ӎzmha#MXEG `\^>dGZ C65[ 3dFEG}M"ZE1MXTLIuFߺK3q=-)Wx(~#ьq-#)E`Ɩ h NTRf+QgId !֥ -m:U?Zhioi( *\ګO=ݶ<diܾ~lz ,hxX[o݇oA"}X~ع6jVWve8:!Nۣ=hx}d.ڂiHE'g*&t: Cl(C d}Ӭ6TEP@.ZRHٳEYdO:KJFAgmg5u`c9{4Ꚇۍ|)>㓺$*)`>fm8{t IB*$0  hM DmK/7g ='"]VG['e[4`=qɋfssҀk%4"ʵ|P(o`Kw@zNjދFbZ~|-Fcȡ7ƨ%AW|( őO ]Ȑ.Yi[ecTZ9b GitD`"%"GlGD*Di%:nr_ .Vw'o\=-yOݔ"^B뗯}^tk }|!SϬOŹi՟,ldՁsU0nh偯R8}uJ? L 'OvW J ]vEfC6)4IG7^( d]n{lˮ4x+Ǐ'mk 3~iK?9Y[|>F4`ZUI[3=gewZN& #{"27:=ϼ{k;;b j4 .aٻ6+Ua,R݇Nfd$n N#)"=<[iXU~OiFkO^*)9擃Q38FA* j\/ B\?Kb}o韉^II~>GU> m_ڞ NrՠL[_hxӽb{ důc9]M+7G)7XH>bhp%J7bi1gC Ƒ&&V@sQ,0wݲUꮦ^-h$ ӬzA?Ռ):\\w++$V0O0"VBk'P+xy0հ֫ L&v2M]ml6LWv|n¸rǤ営[M3Jܗ~`F'78+1Ľʜd+{f|Cmo*[m3LR ;fj9 턷1e%7Ǟ_9^Xt34*DDI}}U37[W>fQMyQy{xI{/@.ɵE'7siB )oHL4 hXbG7 ph.1E 1طd >2ͺMG~)RHSTIWjuJb铹7TM^~o:QIh{oޛ2uB fHJ䒓yHrt*Q)x^=w{i)*/Sٙ>0z0QY1@ΠjA|\|ஃr  Ƙcx4ųCEzqAWa}񚸃S 4d U5QU+S2:7j00.;+!/bn67~NZ*1Fe9J=gKY$d"iY"M-D]&yᜮ2)np0sHg[ S!HbeP*?(/ؓR+eR1EY[6;˜\^Y#JٳG4;Xy"ȥy#ݨy"ݨG"F4.= i~B tU"SQWZ]]%*[u" NSOF]%r>u򹃣#UWJު(!TRW@0dU"Wwr]]%*60Zur#SzB*ɨ+ W`t**QK^]%*2HZug =`дt~ 7ޝuʃgf-7wP$gpǣ3YԻz9(|IQƃ ָ1û^i/FߟۂO~6N n;|1%bQ  '|T?2zi׌C"!k'L(`R+X%\&_Yi)'X(>cژLS-3R G ~CxKAx'Uq~zhۧ<03V ^B e4:r*!qKC$LihtҠf ap) \.zɛVڙTxGDJ2Τ"0K5N+!ʓHp7JQ< |e 6R9`t0:('TBуEcϰ02܃%-? 
4֖۟4m#HHCsj9"{-18i1]0vԬ .Dη@0ϊ⁝?s璾y΅P{rHD¸3\}FtIXXnR$иbMɓbm-yUQ>vɄqvb?"a_IA)y| j)#GV6sVyYSa fqP0vY4a#!M\@SJx-a)< S=-/a* z06tzW&%3R Y6LrRҗkk?ܯq6mn0oxBC=$rjfE2{<)yijaQ.V~}}6J'L!2 ,%& ?}JJ@uam}7T7,EjJQڑ1E!CfH12B̷up:q48$ĄtLK[M*L]}aYۧ.Jv넗TKqd N,4F4$PSRGsVg;)-]T)GkJ?7y:w/Z|,x o))Y-[t8 3Vp[$mQ*3 o`8.YD%-OxE."i/i}שŴM".ƟZget Nl@Yt-3hUܺ}Aϳƛ;6Pyz~HM46<vךfW7tM-1p7ږvkZMYs2Imt/])wrx&B=-ck0oak?*N;#x㙎7kXln5"X|߮bw] f1eif JHQ{G@z$cȲ gtXh4v'XC0#BXYL^jʈhA #(H8n+!p9[#k\6  ;x`/.AU3;ufΏJV=]QX+4 DQSap"i&`"%KhIL_JF1Zle ;6`CT KIwiX#g9N{>;Yj#u꘢ %6|R O&7T+ł 6sغ,f&H%pJ1g,ض&JH"}iy0ŔX2A 8-\A{Q@TpXLT0DJd IJ*P j4!2UXZ,3BRkEl.bҡҍ@t'S[k SYs)$ŝ@ %&0<,JIG 2뜔:HK$$7f.sCA,ZE D1Z{ncZK)$MjxlX JB#XDFpDCJ(26,gڤ> n cgec< KJ0C\pJw1-ƭ4) :+E~fI-O8$*xUF2cQ, << q LVs8^{*eT"Dє3 sX"L$@d=aZzH_!|j h|JiRMRՋYER")*):XXŪ/"#Cx:*_Mk͉s7DEDP$&)xfh#:jPzc >hGrva~ ݼ, Ҝ< QOV CyMTUZo(+6S"UĺIir @ G7'[ܿs ^{{vCx<퀴΄btT*|:N+- \9 ԒzA2^'3 u=X<%`-K֤ dNj U=&bBΙI"))5Z8`:X`֪E+<L^ڟKS8yr/n9l~zgzBڍf/wTeŎ dD'sx_x>??n'Q_^s&0Sy{f;| OdJX 8GeTWY-u&بxv;9 @YYMh"G&&rE.Z%0 4{M6hT%dAP9:* -4yb^wpIz|o÷ ap~ k>?·Zvb{[ j0'YPg,5aY%ib q) :.}7{pEh+.ғfa[_)Aޡ뗙(pcrKI 50Ryo@Bhuf"Gw>XU \#`-ZScvѥi eޯ6@v9R68*N@lP) If##=8›Fs!1nzԄ>n5qv5k6p E'U7{Uj~o[OM)Y¢׷{ I&Wv_~b/VyLN'c&eB $ɈVK<$r]2*3{Z](T,GӓBN1[_)t\:EB8$Fe&~˸VBQ%*϶+1ʎ yX^< č'ľŦegAEŅ$E ˜  z6oe -4bu ʖ*Z l  jP:G ) !e_SKO'qùծ6;Pj`k#QeWN;BK4aPEhU([O4ZFŪ0u!^B!HmML4( #ڨXĉYFЩN^Mq_lvS&7}DZQVrERgJzlI`6b|@RI1- \P:Oֈwl]t~дAzia'Pu݇"}iLA}ߪ;o۝5n6c dI$ S?̅kPX,(̻@ngovnmBfb"!%`KІhAX8;dV[䄽K:Vӥ5m1ʝ5XY3jI%SGedYD+Z3NI2+u3%O*UX&AR FQQ)& 3,19R@0TjVT-':i]e450/;|)ǫ|<'oE'Ky郈cmyBvjuo>!Lf~KSF^ߎ^fÖ z3M/iBAL#; FJ+8jg3 )0xnӸ OݑӫrZxe vY`^lq@R1Imz\+=;;483cr!r9!OSv_O\9{|]kwZNt {ObxNO Y,o }-.Y^ˇ^ˇ_񓿺?2 ~e:{xZ o,M\iq|MZ}biWonE#%+wFѳfQ~"κ k?&iɌ49AMh}r%c/iԧY#{;ai$k6,s>}l=LfkQogӯhg_sWTi{z4BE DʫE ~LXg d $`|(z 9979gADghJa+CPK̆9z@]Z 0(h,2DzI߂6*tvD QJtb_V8 nJ\A̢߮_%/[w+yDfwq39sZ ߨ6:7jCS~ p  ߢUZU!6S"Ub&%rނ+wn2J.p}q"uL-j=P[.d *uEɟ//Uyh<-k@,D++=15폣o 7V/շ_ϓ#\^d{?~ETڈmXh퀴΄,mNPaqZ%PhriUj I 2@-[, B]':t X%5)־Nj U=&bBΙI"))5Z8`:X`VwЫ{Mz%f8{}k~,SbmBV*5cա^ct_Oͻ 
C/n<KAb4R;IE,hP-XοYta}^lѷzZcPuB݈ ٍ.\ȁ ^ kt`4G0$b?*FhpAR$MT"# Wea08o?=n~W0j)~~ص\Q{-38#8m2]ſxx2G~6X^?דk̾UV0Vk ׬\NͪE_~zz .é9uH6)rxyۓI56 @p4Gq57NSSjI?IB6OO[N(]wf̰5Yq/1̚o2~Sϗt~k.x?f|Z͏GG ~q kVs򶽜"SzMۺeuzr$ )o暔Z{~ึ%c59b/:OO:'V_=\BY-z%N'#.l4&%4]}@ߌҤ(xvtXnt8G;|ѯؗTM6U3.%W.bm= g=8vMK͛IUIO/ f=^m|ٗ&)zf~ѷlbugɶ9[ϫ,6ɥU.i}=V]{D]9:kl2g_O p4kx"u;6{kuz_Etu3Iow1 5: 5>x2xƺ3Zh_b-cVs۵n䖞 m{avm,/J.!Hfȹભ)*ޗ8nQcV)fƊH0ǔ٭ t>߲! l 7D&+S(e|.92엎y"#1Bp !գDa_~Ks\n~iNsޭd~79"2$ qR)&oy%UZ8S+v n-lCYgE z3I"JHU<kʊgFkcDebXuZk(ء^k3X6-O-jSWqq.v}i%[Ů'JŴI ~҄?O&iyS|3$@qҝ%β~'Tmlv<8T)zY+mX38q33>:_ݲ23^ւX\jcRrX{=>-¼,t_OEx2u4߽x,\ɬMl Ase(O17?>BO`gw@YӹЊ~}=`1QYH4rF(Z+NWwd2f7;g e52h22;=r>Qeƾ(̝fV9ٻ&NoϿūfL:h)q>@~NE.)Б'{~cS%Ƽ&s 8R7$_^jh2 mW۝mO{Zgܫk>J_pw#@Tc͟5]韹 V&uxmFu}\7fi?_aǚs6uw5t- m3z<*g$K7o1q zpH{_{,OYa^owu2*Hʍ^7ޑ!-m}T GSQdnTpYT=KY/ogٺo+6/mܰ3|!OWم~9v`ͦӹt~yW7dwe_IabCm;|Unr8)X_֥<:F&9Ngum%n[ݮ_kt[4A+OJ1. vNK]7?Yk??=nҒ_iz7)][<>PI3nm9 +k*lfL,6wɕО˂ 5 ٷ) kt)kz6A^ ey|ͧ٩}*VMZYyܼ. l>@&җ:Ͽt_x8c::9Mae*ktnGl=ht=UZڵb1R8>MǞ lpZV+y}gL?+Ie$_{{˽tZ(fћƿ*.g겍\`\UoqTZn@r*s{bXzp;p;i.{ñxhNOydroY^oM\lу+[2V&xn"-Y$bn/ qaIx`lD'g%_*-9˙s l0K2p(sn$g %q_w; BJ9I7UNS }ӥR)MQsf(2DJ0) 99Z9$T 3VX YTc#CFG"Z|=&]Ak4Cɑ/2srdKF !qk*F_KhH&!L75"mm RҧR)SOA /ͥ;4@l 5#1G~&vYc}"*X-@I6rQg 1ZqW^*rr(Eb9J^Th +/֨;pݨqU08x$oS[[{|1!%2'ChqjHab $Z$ɀ_4E )U) :$K+,k.Q(-dʹ<`̌wBZ͋/$9z/EL+,'>rƋ  3G^L.)0`_| Fc"}ZErHad$\x`fZS[O8 qX%1iU &JBN ,tЩ%f2Đc!ڰ¤1 Cl)hP` Yhu2$Jx3pYf!0#;^a2pJꬆ7L[P1c_U&xs"orBd=|">aq_fHa YE%d h*' ϊH8 d0!^4<RXZBpqϋ `M2KU9, ֊4,%0!.'7!9` LVY@ C>DB fE`ׅ&RQ5a)Edi0Lf]0E!LL&FW")(ؙ"KI( p+ 9+L_@JN 7%b8b .p3SVİ ,1vRl a s2Rѕp0 ]R $1<Ǥ}M)LBT#2Bb|)Fr{)b=!C A eЮbITv6#VZ('236]qV/[:iW,1WW 98' W#FaR; ^%`ąU^o.c+(Rz,۸ HP6:, dcQٮJmՏ/XA}(~YQ"m ɜB]V0N@+<]vu~!CLm]LՐZ"!VX;KnCz=pi!/9(P"Q1A݅Z*@QDh7QA1!%tMކVAhGcM;fCd`: )N6G]ƪ&ڜ]5H-kFV8-5*/8t(Zi2Б98kc6Fu ghH0F/~ )XN$Jjƚva1ҮO4MEը Rh›WT*m-=+Nc!-BZH̤A@ ft,2 cmbHuKm0ZwkҘ{j :ޝATi,.q_ܤ "ev8Qc@sJ6Hs#SZԨf=_85B'Q˃>T Wmg&h 'k廟y E_něZ#@KvG5*m@<2 v0²f@bem^0""5j4:Mڳ.ۍ9ml!&XKo&] 120+&Ov,b CJ,;)[`4bM(uiU~fW$<XD38XH5aWKBvjZ8o 
dUyu@1|q,/嗍؂8e/iDe(呵ּI۠4Y}O)=x['>v}R>lP{:/(_ u⥐@Za =@F D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$KoO~%d$/Zv$8C$H)}I$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@/ K"Z|{A$Ő@t.I "H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H "";&L$D\r2F0I$H9FՈ"H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""H ""!n/GBsie֖ח}Rb$/fC ]sq>6$E82z$68b_,l\gyNf ѿ궗.' 2I7%}T4Nq%Nj)d}+ !q>ܧl?Ht~epX>K'39Y|o+\"%(ڕgyqAIoBwڷhrSkw?\]9dGy3<1{`c|QG:pI3G5ԜOuu˶0+S㠂7ʚ:5PlYuϨZjcRs^̧/ JՄ:kx?wݺ1k8l=kQmYPJ(#76Z{Ƌ]#FDo}[y n\:y98|Z2.1#7;~NfSGog}Hji~1c*C9wĵk:3In2(b6Cmly-/m2EO/혊W-I||Kn/709_]޸Oc8țO $ƛEI<-RۏrIـOчtzl|0g%]-&ʿ6^QmjW] bVÒww]h16}L\(33=~s=:Db>ZuGTmz- $]IRy<;B7U /2rAkUe[ĸ{geͤh ȬK9w˴63o}Yuaf{Ȼc۔v$Iq5.Ơ;\ ) ?:贍Ά!%<'Sb2XTݪ$消汬eʺ?d)v꼟uyطmSea*pކJga8Du2PRrی*[1YUfG6,6Bi캺{y|fsnmo( qCб[nzJs>X՝<"۸k7{",Ӈyϗkw_'iv+7^5755Z~Zw;jg#(ZP%vf)f7E }zU?^t׫cG[ٛ]o +_K#R. JvpP[}E-_L)#w⎽X݁4z>̧v6MV˨3s^#P屔j4iRhmR}6˴H]f:nܭ7GcGaf$ H.x5Ƌ!O֣b/Z_u}ŶE +/,-2q$<7A*T_];H:vn<-<%Y*U1^cmrUH)) ɜm2Fz̓PZ&ʿRRѬHL0p:^8J'Y-M;SOZAt5 3ɿ.M(p;MZm[+pg]{k `瑏];Gn m$n =l=nl6JtӳϺǂ@^]4 Ġi!CG4̏cmPm;MԊXd52(+/I(]\gM^!cfݢzi6^t5='=Mfe'j LSmLw{a4Rfr8ꬉ'TvՔAYn|@m$G\&DStG:N3so1׫c/U\"*y6CRdcRvU=V:sϱ!X_?<3M>R3+R)P8ZVS)ާ+o׎왎9wwv=AB?,3t7?9d*ycjɮPB]<w%K޻J/.]{Ŷ4ޑG廗/ăP9 펛kŸTEqU$"YSQf<ٮǁ#c7sJX3s, 4-jv~є&-W,~_b_~`tAY7cTApV"$Y -:ng`dmR,7dUx Y0Mil2blEmTۢԛ8He;gy<;ڪsVIxcyi.FiuH)0 hVX;`arڙ5V\u4%TP\d5dD.GĨU+69wñokn& M%"vC#5ED<ވ#7k_՘-zQ+xƔRCe5'C;Rʐ37TpQ3w2[%j31T1e< إJZ( YUjP9X#bXuc%E9.zkH˒਋BWe􅫶]b65#R}jUv".M<̞ رxݘ# |Ǽԑ_\F1v"8ZV?Gc7OlדNyuW/ؚZax4xfw8Ő_ #&NE|:Ay07XWR'vŸ塶|\e:*EE)H'^&jic f+*C=XDt3k>oonY绅t߬KC>4l|r6/Gۆ}ͧdz c+#aQME˱]o\@K'mˆ/xGAPcKuSLĞܱ';Hj]mMNۈbS2}ѢU[Zhu;8[zLؕEao4ځ%:G g5I4*kbtrQWOWOWOB9EЙ[!#(0̂IBLNIf|n*!HG02y摕v*A"x sIbv&yM.UBAfU9?S8q콣g^=Ud}F#/jyAp7!99CJ4&+b}X]qZrv1&׃o!!NN[̵(tٵ)ɋvɯA}8H GޯM\*nupXL%LcJ^sy~Q2G]W~o%rYxּ-','HlR{*4cJ =dJJh\hDE#^ظW bxŠ@ҍ)L22x%"ХX77:خ3vGaJ|8o|^>[;hiզĊ!9:;KBgLeMRZp0hHLy+:g<]^T=4!;\gcK@`p,xm9)._KC2'ý^J#?_OU65: V5P&AS`LJ1Zi<ަ, @A)g 1ޞ3~X?%'"n;dXPCD).0Y_ċwNA8Ɠ$짲Ɓl2FIg.d. 
8y|K\^_?Lg#V;WÒ=;vft9?9,1\FO_ߙ᪝)f2Fd hs|ﯓ[~UIv]"e-F3wܰAgrJV Zϫ[l 2H1 sn{=*>" $bU_2~Zw%75q=LmRHi%?oahDFW?{;]U:4raѶnpM_ W6 !M_Ku~8Vꊮa|wgnߺ솕ܫ0"zۻ> c21mCYnݰnedϑFbۛV~#pܸߒ=|*"3븬s5_O"ڦ`-wsW>q,~MەjGNnve9b+|?5Cw%e8Sˏ9eFTTlr5c*c"}ot'痤>+Js-w\jw08M :n@㜉˦ ,dsaMp6nn]yάI^8_(O(rMc}{l3ǞzL |0\OCQ>\ӰcN p5i$u',ر@ݨ}4d4&'݌bLlG0= |w_NqoީXY>VXWT11HN!j!#*k-r̉y"2Eʖyms:Xr*mU i7OjKki 122bAGQbGɸ'FQzc>oNz5xhhU“U/=dV6j4V>Z{|+L!<4YhdV&:'F/Tk?=Ǭҽ f缄Ԟ <}Hl33*+ct&lKZ~?]jdmp078h\ek7a En}U柣whUh߶+u%%?cKj|_뻑9J6fXu[:gKZs@53jf)]grxLj㡼na+Y+I:q̸Fa HÙ N+{^|XR2Ɯ`0sKtFHt\>^V%^F:qR_B!EcW7 V?uj*nOT$t`Zh9Y2g2:YsW3}1n/fw7m!!OU+47 ſSrqq125;ABF% Y6ʒIaa=Jhk)NG!0C[Nbܡ挑<"" wQi'XĀht9pܓ%~լK޺oyq :^Mψ l_"=n+s5拉\i-Wk,f^&WJR^πJx1pk@B# %9%xqy\]{#}q;CݸGτ,?֮NRd1XњFyTWN7.3c$IrXBO3Au]I]\y_+z4[=bV{TpԶR( 4kq@*B02ȔW\0/:eJ`I+MO,䅳sC# Mg  ҙ A* 5DHSh!6%M h}6Rrnh9MB?)O(D6Jn\Y}\jVr"zQO&+G\QnfFUcL& ơxƼV[ T0VrEc(w5rUEJP{M Z}DP$D4N%ͼEs]|Şrԋ$^nDG.@F'uHEGxgI2-+%i$Wd&a e )CKHI)彵ƒєArX )/M S2Ks`$LEN1":c% ^Vj)g\٢_F-=,U7KT)-`CŸbڹLֈ,2E˰E|8!3SGgfyE _hDvyUP:`WȼJt Қ#rUՊazngxfg4>vKytI JnxBe\&+T d3^ZfGqRm̫Cژߊdv|SA"|Xg+ #;F9269p)\Bk$91R!Haz'KgldI-pd]"+4Hm.92kNo4qx=%W_>.XgiR`)#W֑?mtG+H"0 HUp 219 E^Zx8uCXa[0h3'c9%d+xRPЯ :%=? O?bOeJB$+ 1ȐDٲ r /-УwH5r.$w19fx&"e/HuA 2{-AKBI)`bѐQ3qB1'8UWh}乺loik&(?ee|L8/idV\J%Ul0)M0I ]2*Du>L~w7ذF]2I`jy^mI nIW"?S2ua3NBvvm,LΦ3Rp#Hc&Cd ! 
dr;VjO m[7:[GB>ru~AQzWU%*aW7_^~5{jƒAD /1 o1i=&kdBpvJ|u%Fa̯Eb%1Q^_WVʮHFe_rikmH; Q2`[9 8d!&,${;H4ވAئF랪>I#yaT0z*o K=w1 d-r&~])t=hvvUG]MoԾJzrg]}~dO^m;O9/E͇j1jhwo3MPZmӧ/|]u|!`3!ˤ~:;y7+ 4#2Dϸ?^Io>*]@}ǘ| ctn2 CW/_> 741Z)`!6 c&r,hهXv+sna~4]}X攦MU5%-48C콊E\50X6r,JqTSo w}NLI+G{as5"$E9ڔSo6*eRUnCuJ]InCy)g;{q^&\[M&||k5 be-0xNR~C?&ˆYw 7tK;Y l5a0 }֑B ʲZH21+I≰7ܜ+{Y”/6ꣅIv@ea5xڙm-"XeB~NYۼ f^X=$tN>ץ\],guo+XЁooɋxOb֞in-46Q{3tV_vgrzځdJfRS2YNS%5DT]_C\5"mzk'$,A-)1O ~ZĿޢ14?TӷZ> |A{~39;3܇ox`Ʒϯ_>LZ0)/KֳeбBwT9]8[s6u-P}ʌ~UE I5vٵQY>ztĉqs|jɆN?67+EN5ޜ={)?L!hW|n|=&ދ (z\zb{+GE8||.?^uIGЦcW6!;'|o2ZUjP<7 jHa#`V6@'N |u9fVcᛯyk&Lo we;[J{E1k403ƈ)T'2C;,')A_06z(pdP,+ ]45Z.*``ffS@yE _X@hPaAZ=%E@i$A2 !U4V8 g/,9h8=GD䢃^F=Эάx$ ې, dm‚չA,TGWgjbQI9m,9xY'OV`mtyu 7p8-f^`I%>b=^!ExB.` "Q K ||("Ft3 'S@ʀvo4KpoK:G!;]xdWnĠ3(f4 bduJp|A8 pdI..x]~am\g 0S"Io&Cwx ZA 7Q%oCAaI6c;*gDH! cȡhhͥVa"A5 uV4Q8FҊ/3ˑlJ& QRp hoP[ ABS5y+M{-9KP` _ɩ BhSvn|.7=C|:/1-Ycef &\4"qPqt3(D4Q\ۘ"mwKQQdfauT5fgZ Q"eѠƤ@g?뷆 rPZTH{8nX Ȁ5ȡM!ds \A7Vh-6?7 N(r]gd*DE2 H*,m􊀔kzL Gc Bf 9FBiַ7Hnzfžb ep!5J4Xf=P(-g/U3d$0r$jd-µN~Z-݃4-Q#b P[;ih 6]s…AX)ܧtat0 -!VFB;?MH Ar|d8NrO=b8%J ,I V/ .:㽱l,1eXuۥ>X`eSFvL!jIx dCt]XzŕZ 4bˠ2ˏp]K|!wTu)Xoз,0@5喯zЯ|.n.\&-`Ueښ S[L!d8F ~+/\}[̔,ZUp:[ry޲ħ{7o-5T*pvQmSU;U U;T fR|1Rec%пd%j+Jh+Jh+Jh+Jh+Jh+Jh+Jh+Jh+Jh+Jh++#G/ tJn`zmNjHoя>u3*:3BN Y:βzE)U* ?H/D0V<~`|0袼ePN04 wɻ <հ6J:RLf߿#U"W"[l:Nk&ta-+mt^Z{jMU; n/cЮS[Vɩ[~\L4DPZC%l*,D"%6Mf+ȧEp6|Y0/綰dzmX7Eu^ϭar+ M^Va}g|hZPRxž׼UXC -՗G~烾EqQ۶]`g.=rj:9P`!hmLr; P͸&ςa=$qM䅄e.Xk>Fi%,!ZiX4c;"pC;LjE]RTx5l"ReS[ ϳ$A Ҙ 9rn1 H tiOH#DLLȲ rZ8O bP֊Dn Y"d j$~\&ӟ](YvF9L>Mա^omw΁x?>hGC?Eh֜U&LIb2nCP Phy61Yhm숀lq+ɩ_g*r鎍OwyXcC,K֙(g&@O"QI׊SnDo%q<3өXGgru|IYXpsEPTmrإO<0.(̓KQ\%[Žj4 ]`yF%vlfG[eF7E6J|_6ɐ4k!]@^ƦIL, WREcsap_jB^ajBWh+zG6R(b!9 -QP0' w?jZ J4r@jBxTgMrPro\{zy V Q49Rt̞c%em+ KMJ9C\I#+0r]?I^yk/ϟ,7dI^?&+Աق9vчr'KzpߕB5ƅ/äg͆zץy|G<&8o{GwUƛQSRJ]pʶ aP0rvYؔ37k?t64x}jXc4QIޥM@&7ZfMEpZèz蝿JKl+wꩤ%翵s˗Տ* 2It09 k0m?aWJz#Zҽx^_mߚ'?^ͦWWa aF0c˫g] (ٴ].v BRMBn~װvkVU aLQ=GA^,+h84tWVFvumoV*Y$8-8Lf+;J+7bP*U4^\:_?~W?\o~z_)pMu~yWի}Ws#n:*qb_f_ _0?͏Sr\C<: 
a,W5_A"l m;DźV "] Ėxe{u8&R|G>GZ_'$}I7'i19k(i95/J"}IѠfL)H)Q p'}F@I(N-Q0."5b Tq:y$j+Hxɹiv$ӝ's%`gН ݡCwr]EҜMrA ¬BUfUXY-Es2rM#^>$(a)Ɖ9hRsICz՛ U>+shc1rPlҟA=].و˾d PP7AE*d: Ot>\ao jjq:໷WmCߌwчRdߴ` E~8R5Og[c^ÃyɣpTໍtլBr![ r5a:-5 >ֻ?ug Zb@mN?"A*n)|TRp+CDrJQ4/YJJ}̋}>vF4xH2Qn9'e5 zV.e OTfV[KIOSwԙ8w`NN((!? b~3\(1Lk [t}]!R `_2iҕ,ASX9ːcÄ6KB"%MhoޞNyۚLoKZ"{ q2笡5OgEiHթs=GŮsu>V"&y%tI6>Q=`0v "=5Q=@ ֣d޹ .SU7[@::CIt DISg+Y+wGRR23/(18-"QH-Z)ϑ W%Gd3q i >¿?;[(Í?-ҵlG]%ԦF|Զ๵TdKj7`vSu ^fZ׹5 <9TȝMF]$WK Ā!"C+(]һNT(?%ެ+Zm=]{e:PJ0v7;sCtEo+1Cs˕4uh>ZQ_wI:T)hU Deʼn#҄xT<|$WrW~DͦԻq@n=۩ENU۷(VϮ測{ uӭمGkP >U"LĒvl2*[&wIނO3Wנwpi%< C) anyz|AG$ UDX oteUY"E񾗧p yIɩ)7q N E+9>bo+n09RnW%5ڐOH0/Xw,fSL:"es= hd03MăZ$y4Ro10˲1C&1lSShs& )樍uH3Kp :lpL/54%d.2"PAt 08)%s HJCcyc?N9=ќvZؔB$qSt4Q <W&gy;|_'UWuh<=)ȩ}&[>C٘۵=s1ǹ%r]i1ZJhI9pVU\'.#0瑸u"-tv.Li]uݭvj`0~bY^W[DZ J<iDYAH[]w%3xQvU\@l奷UfJQ׳B]wnK%IkR2D'R(U<&Q63LWG#;T[k4V7X=5φ־pLͿgزEZ{~*Uhw P&RΡ^=V͕OW ǃ9oF*'ᴭ iEaS"{s>`N(9eý,hR fP:3N13ETޕ6ٿҡ/+jCcmEزcg_ډ:E@)n @PjŌD6 u嫗YY M#RLjQR +@ATl=N~SDl ha[0 ^И4[s:(4|J1{ީŤ_4o|z̰®JiXgte%X:]†Q ܳo6urSԧq B+s5KipIH!G$y4Ⱥ MR)Ii|7@W૛?af`V3gA,r">D{G2`gMT8JŐ&F]Z;8*!.TT,@vGEgO> 0 B" T),VJ\fA݂֜OkC1Rv02uZSȶNXa%x;M?OƃMPuev75ۥP*vԔq5_+[S9EɊʻsJ aik,`f1 ^$(4$1A1˂X̊B,RLzKq=GmӉW12#VP1g&HFlٍJ6,̶2B iERfWoRaŶ; nrM)Ppq88~|ّ%Pm-qF=aZM$@ 5t03 u( CR'ZK(IItY%#vHJ7Dj]I#Wsv#y*mu,3j{mGF-v'b%Rhp*m}싈<3"{Dܶ1b 8VJ tg'D(zIP))9O @- f#H#pz ",iP΁&MZK9&s ՜*\j Xp3*ef\=.uūJ1KxDZ0Kp`FF;$ qx2xku K5PlJ U}f#wZ1):lwpiAcWK,$*CA A. 3f 3f3fG 3 S@ XHpG5HZ7̚ i,ѡytȽyBap)U#Mr";JVbң4#jH9@{_3k{+O~ (g=Q~ǔaɍ2BZʱ`Q4LȪ9;8oLtmO%eOnU&„Pb{5I7>-lzCF[ʁ'c)R{!yT'e,0C0B0)Tb4/#bxمg#!Nۨ<  d%' 9k5hw'oeiF/G,ȓqô߯C8dx%3kmHaS4*f9IFNK !h-W{OIɉ}-?rDv5^(f78{#3 -t U=ݏEIhK^zacI]qx(00:f>>O0yvw#pX+I?8'K?\}T_M]( ՍUzpd׽ŝizy.ǣR]{yY~E_rʦe% Ƙcl:{15qŦ;ț~MN5?{*r+j-ުrD9Hж gYXv*7E2nF^w}⋍^MSS]u<|-{'d ul퓽Z0GIi|@:_^T!#_a*N[Z*m_CQeL-װۂnR?0@m{G2( o[c1f&CR͕u^;y +K}jD[FoVӎ|猃Mov$h- 1&9E&ؘfN#EJB(F\6,|#,a7<};1k ZtVQKbpHyÄF g N@C;wzi%&7+ > “Ǫ'BfEQJwI}l#F+$8VltkV,'C3 dB0J? 
NkcՌ,2@l%V'>L4f/Kcv:WՖÆ2K)6aAV2̸w(R(μq<(#lY)2e-xB?~@Zެtdx% qBc/?gj?1k5Gq6`LoW WE͗|][T%~{"Uc /6Q˔!EVcVu _iPSU^ȳgk񬕶>K`p`jܠ z=YGjɴT҄ujQ@8AY,-U%^'U qGP\~Gxə."$.oq:S%Ջ$0{;Kv[V-l;t5 ^\+4lG?Z{;܅kW .fޖF\2kP.ʐrQ4j4jP>׃iRbϒ-abfeTЪ0/åB(kB)06h%ѨXI%V`soЌNj.` d1`qX!cdK rk:$:A&ρˠ/S@w0]ԟfKƞ 3 `5Pd{,vuFZQOrvtz1KVWTZ oɄeB0gG Q{Qe,:udCEc 6tˀY,Cjnɑj4.\xe>:Hty9(n+u؁mv雚Pq0*|Uv.vr;Cuyh $nj1pݱ473߸dzҙ37]kY=PW :i|ܻÛjqc5<S'CI KZ*`)OE252vGl[lmq:YnrwV ,(_TP2]C`QLb(dm^j^wZ:v'W %O_GrɇC䗒K>va~ c".;t?yPC k\7{WFd /3> n 0݂GĵDIʶz0}#x%ȢDe*"3ųȸQN m[NvV%/4|Π `i3Xb"K!斢I-Y FK!/[([}P7{ S(9ᖲ.V~)q纡*pN~ɏ^Ɏz Qyi8`GMΗru4\u W^ /J䪙X-)Ňʄ2oDyeO꿂]^97LCҾBDp/tiύ":{r%/MBb/_o۲ػ('A_y\>)SDH3'F)NK?PZSg2&{3Z7t!X Rj+$1x9eކ:.]i}U}Ϳ_F~|{NzuGﯾF?{ψ آ|Q3`궧|VkYS{@EstS駚ׂղ}$S)J2}O.{ҢO]J`od|VO8+hf:2,ёScT<0@sJ>K,h`SFEL1Vz#0׳0.<6wjZ)7qo)gKXER] 3_C}*H?[9P[6rw׉:ꦾ_ÏM QqPP:6gh ^HvYC|+sZbW& e-)Q8v((摌(T!ʤn0$5 ϙ+% 9J->DɹKE{\ݳ*' @%GO=R2_8*bt v4v(TB7// zOu7fOGS)lstDr4E5# \D(} 4+_Ƿj' [l֙2IR &m5F%۞J75n]e1/SpOSo)s o.z4,zIjtkF>ش5a[Wܸzm%ޠ7U>~ =]To?\jͅiv4`:>4J^i`Z|XfOGsmЯ-O}/iâ9Շ _ Rnt(Sڥg]e2yspy |@t<8i7ЅKDӵYGkV][-D_1o3 aW%.WV3\9dbJL;cYd46ZssD,xgXJ:28l@}2j͘ ;TkY[kv ވ>={ ^~uCfr%γ|&ϒYDr7$e <ɃxT~Lr{YzkEƤqy~[CۉxێoR=OLv68Z=j0-SdG2D)[&;t<-d 9޼pq>r :+ڟ&a\gqa2rchbDchE>90pmaTS)zg`P^TPn=*$-HvmK_֜dƍyʥJZ(D2nS!1 V`c7&c  SfZ&렌BPQL$29X[B,cj/NiN mr֜*DzXMvr ѾK G4͂ΖJ^*MI;R>}OĐGۋ rsA( jr>z9Qg7Fk 8lZ[Оiyin 6)C"Qr LDj`}Ӑ~ ]jD!"UfS%$Cy "IhhL1 & /mOt*Ђ |8"gd. 
gD.LȒ !rZ8O2bP VD|BHISa: O~s?ObhZ'$ 6HX0Qph&G̈́&"Q/5тt G+ &7{.]oM&VғfWv]ߔ/"/ oGvD _[ @a(*/X\ah*.0`*t(LϭEogIrEDJ'Q;D0 :aDPo$󆧠RIl&Gpkqs.AE`j4d>jKS$Qi[i>_MiUËӚiYMwpFQ%>/ 2@x-g+ $%:KUv|TQb|I"\kV|~q&6q u\hô:iBk4:aP@q5@ (X}, LZwC'U+DVF-JY, IXl*A8NEsNNڟ45 9ɄhHJI (*˵ǩ7kh]PB4DфNU]CӟͿKS*$b4$( xJ!!.z1'UncR6v&^m-Yz~~sאy"2؂hb4;Mpcג_t]oø{͆ٓr%Vt?kelS8g$ygj2QBQNnɅη`x8q1\͡>NuU^@P'77WKxy#Ldl>:?Jܣ:}wW.%Q‘}3mZ^~}:9]hf%pVi!h+.ф|ң۽T?WO/|u;~vk!h=o̵e5.Fvim C{7,^WkW#lm5ˢ0`XQ |J=\zݟ<9ejkeZgTjPq 3 Kozy'ްyoB|P2kUXP=%_~7oﻯ/ߟͻsy{G`!kڿ ?GU۟WmX[Uc}Ue!.~[R8?tA?98<ȇ ^b?| b/ξ-G.ǼElEks.bu"Ѽ͉z5%X.!n0%wGښPOH3_"qh>5g'Fl1AkͬRN%Eo#~<5}PB -AH0i/D1AZݹN+:ږL|&0N瀝 y˒i=tg6).8ཬp+LOѲ6x&Rp}!Uu19oK)Hf-`ܙpEڅpkD.tz l@sJ>K,h`SFEL1Vz(9oYjJ7yR%F nK !B @Nzo ~wM}575&?PmQϟ/qu5jo֪yے~Юp7z&e呃!-T f(<5>*nn/H/D B"W-.P -2kJd>5u}V`xN^r^!:2D!1QFu:%*EKZyjkݗfe3Bۤ.=T[if~^aa O /5 ~$xT`vtx*7HjtZ%&,XA@$[P%HDȺT"/ɁXݎm"wR(ʽX9 [)Lpd^h ц=h"Y"cru &Ή8Y¬OGy4Ro1?{Ƒ~J/vO9#3Mj9e'~Ç(Y$%Q-4=3U_U]xDHCf1RR6!<82x8X睴iRyd$h#MI1[]YAM 2,!i<Ϙ!@#-!uȸJs`($p-= <' G@i;#PZH$t> +-m+OQM@-3ɸ'n{fp=cNdu\t㲜KU@-[Z`[MnuQ9z5!?2?=dvp{a!Xҭc)\dːedP<)Z̬*`L(}=d{# Oed\3d & ddhaa2ʤtn&u%>dU8j}kKdvN]cOD1iSwgc:jq6LFpծur:s,HS iޒmh)py`:ŁN{pt:#w٘sDErdIO$o"ϖ Gc9s*\Lm 'J[I Ie+Ҍ0#yA !,5Ta6TVjӹEKYR8~pU}\WXɽ]y'Ǿ> ?+o[ԝdr>\x}bOh_k?Lށ-J>k`*Ɯ\6Zq,Fd!9͵<ȨdkRR'abv(HIf!= |&զs32*հd싅2  3*6?+SZSoH>va>?ŀ֏qYq먅t]P | ; ςHސ X>Kmݭ+3c(ƞID!EMMЀ!e2vRf[s3bt7LCAjұ/jCeԆK≇h3 7E(0 ,URhъYI`at)"cUʲr+Z_2 _k?\N#&2I%{UJkP%OkCeBXuZekI㢮sY#YQ0 lj} )$ tAWm:7+uϺBy2#^tC-Z*{֪Eׂ[[6ˢv}"rûhE;G5*IUȭֶLoQ z˃ãy˂*s#Dٴ jġh`烺Ֆ8z/-ـFKf )sQ3e )T1 kW[5[4tW= |b6XOJe&@\VxTw~<ȝ8'bGdM[X}&Nh@n\!8!m4%@04mRCnx٬Tל'٤_9N7Vi:_u%xɚE3ϋ ~MQzmP*!!Ffˌءrصx2\P ɍW1n/"e5nw+V|xČt2e\*/AC΃Mj{~_y;bi~rc}^vmg'yMY6*[x6QJÑi,R/2'<$гzܐsp-,K2(aY@t91*KKE:ϬQ2G͝N #?  
:x ^"& h,5ZڱLh@J J: !)82آ5C!C7MNe7›'pIVK5@(rЧm7 o3(l(C ȣڛU"-M6(NoA0~wߧLt uSaKWY{?W?_41_߬g?\^r%^4%h1ŰcϦgw2KǿVGi7kyi5kKNҼو""_}zq_ eSs@oF -Ĕg}<5>ЬNǣq/Jlk:p6}}ODShaKdy?CЉ5dXٟ͹VY/Tc KhZu)qlsrXomK*WM|rMQna[b/!oC ' ǵP8h=},*̽i\4nynz6 鋂mx >d qO-ӣhxvOZȯ^j+n:ih5FKƑ?[PbfR4hq5=paFwr<"-"w)@-.ܖp(!n_lW9u򛊐ts~OcmB4ɏ_i9Cn*d_0[ŀYw-I#tF|0Kcs3(wnjg{9~iؤᚗ<ܝt6zD-ҲȪuxPN}1o+[])E+}%,;u+}D _W%a?E%'fȹ,8dE"Ǭȭ1$Ʋ0ǔ|.U\݁/FFY/xxF9$1Ye2>cƳ೗A$Sɸ8F"!UDlWϒ qd.ICnF;͍zz6#Y7OӦDM槴I:o [7WXZ\kB >ʵ>Í}w1o;Ӭ[M1Nw{$1/~[&oCGӲտոA?- @@3?JlNfHl6J񚾠O?4 lo4ıWG\xQ^lA2qz6غMߓ̚%LGV.HVy&Z̵еYSSI[v_R[bۖ+tj֜zYztPG% LS^]mPLo [1'ϲlBn5^ҕu_]Hҭ?y?~ׯ]nz=->G?|{6wx藏FoR";t[?tD* ^>Y'ƅ&k $dPy5Dџ?4<\;o_U@VHD mVx?W1Ev)9!KiF>*i{T1p#]ࠫn5r4sZW/'i\O yI4sW` S^6OCq>{B_dX~9[ =Zܧ-Xe3|\ ]k0+/ꆚY襲Y;z@C3&2d%TM.J6#Aq bX7Fv[ *`| S00%61fkeBbsx?AOai٩cV:{և \.gij;1*>eBP?:?jLSiȱJ]e0%[>St>ϱ6?ij-e81 v~nS\>6wnC7 ~1wrq7y<Ǚ7n~HC%%=X &@ĝ@ jd|zJy\q"4rWup,o އ,4D}!\^|ϳ5pQͭOs摒:u}wgS|E>؎1ױ2[֞C%ݙvȭ͋TP͚݁=tS'o0\&ջ ˤ۸vۖ77H]rv~8;24Ǜd]<n%w ۿy啧n]|;k6kN rX^pbwU?i{&Od.{BzPwwt*rɬ'-Q)gCJiU@6 .\][٧D{d&S`eC5IgP6@1E tȶHj= I%q2INWJpB ǀ"t&DX[B$ UI"4f8+j ͵+_r\oSMv`Y>exCWOñeK2  QrHCM `I>% rg2%W)4q_d.ϗFdoS;tK&8Lj7mzt)}RGy[Cg1*Vc*+WhBtB&X6Z%#M[䅳sӛR{4 a6@;0dX=47aX rDŽ40V.AL] :9P)n6V k|*}xPH)R@Jf;3 fΩBe<N[HX& amc< A^nK;5P6" %ZOJF@)$.uN濩9qUCũ}<0Xe\ VB%tY/le䕯.l/.?f-1dA\?PUs>eE4, %i,&zU&U1OA&W3wu(tse\;/N8_#$0 ,d]T \y4YC//&AXo >MD3:+AVYՋW^[,[l`2k"YؿvKB +ժ8Q%PѢr_'6^vaK 5R d_?b=X9gNh,BTd1H-j1{[S[g2絯~XA`ƓnzM.,0,mdӧy跒*qb~/zxαBL JGGix!~ u;V!tJ_7[O{4M ]][|*8G;wMi8|eެ슃me0^ M0lI3nFnf{40ߣjE5rg7sOǗJ}CnuӳqFDf5R'6>9Hu^eT=py]f۠yQcvO{?_釷[?|Zj[M$Ǔ OeySgU& q4=C)fOYM>-SzpCJWRiWrfzbXaew`/ Z o!`@&zYf;I?m4INF}1@B=5N")0\BrIXhEf3N#~<5{Ȍ>I[풣2V ytߜd1aS S`)\ɹa≯$!;/>LC׀5jy4^oW|)yNtSr[NQQw,SJN\:ZHZ+-僫bo=ΈҨDhw4 kL=9nwW?7}n@4Ro=}^?KN&mKkZk7Uᄇ -&UOX`ggZlQw>:O;o%iI~oO.SH{#ХI`WtEёPW3֋\m%`@l%I9gqf HE:t)`tQk`BN<`6X>}0jNk;>훐o ^n]?xXUh3A=ɻg{oqnR荋ޅ.%^'v^h[ݺ6 ނ\ R@-HA xc9muE*(QS. 
KmE" 6J$f># &aOh DL䘄AF nmG6#g}Z>0:79n^0qvP [_~_;6nIm3 fτW'pM6[ MUr m=$0>wl[BZJCw囨b0ڂ& F)E[Vy} AԺ6zHNvw!]pMƢ_h($$"@K9Id3 MEd l* M$t ?]scG39쏯$'fm1L9FnIE@ mDԝ>v!ҡle ΐ  Y22r|*P5:Ad"#dX)6Ex &\C3dexH!) E1&[tф "gs:l ꚑڡ4ntU:|;j/}?b;d9 >Lj!ڟ^M?ƀs&]᪳(, [b\dž4tlm;Ve x_PNyBx3ZxEq.Yk sCD%dYONJfwmUG)ҳrԠbN!F]U2_4FQg3roݲQS t1N6}$@% RMgy-ЅgJH2ЧcQ} 0&k$qlޗl OIleU(xh$Q,43S]:RhD[<Q T$H3bF 5D1.h9RD5rv()w6Ό7Fgx ˵'li%; ~klfqG§Kyߔdڛ_D9*S xD+BHD} q*eQl0D#z*2J~9%YNB}P!PN R6!HUZ8P,-c!ApAQ2kƯ64Y,oH\O?iY<2b)/d8"Bhg*lNsM h˭Bmv:4OÐ=12lr'!(ԡTBێhԧD02Wَn8+&池vkq(jC˨ jw]zT4DʚJXF4Ch\h`$heh7 Kh!fHhϣB x"^s1$'$G:*vَQ}nǽ9 ""hE"਄HFh-J.(pŝibRƘl")xm6%n=hrBM ,ejJ# X8*hI3PP9_ "gS'e-\Ǵe\.vx'?+E1HD NZ]Jɼ5/LR, Jc(L'xQ;?ccvuw:(*o:'~TBQݾ爎|lQlVg/GԃƽaKorB_8-C\KptoAeX{JB-Zux5ehr&sT!dA϶3@R &s_ET*D$2*h5T"=ɦ(1-k~s) 6k`ئ ݮ{z*_JlT4:zg(QQf3UH$$P 0d@jN{k9+GUq5x'gРA^VܠuwF]ZjꍦyK->KlodɴΙe+Ibśݧ d{YZYAU^mBh,PBDń2,aq“' \tфP-qo墱Kwv g۷gw<7Młn(?NyGBh]^ ʕdgłurԷ+0AEKhAXbh7ur6'6Dsq0;c$:'-RȬӮlKmCA}#]Tm= Vn9E87>_'ߺ8viG\Rikg/amm@s7X*J *CSTA}W{MˎhfP&TKt- okf??4n8]ͦsv^|bG?IyQUrJXmr\P @ m!t2DP;Twy?e%A\>=w.̟eRT>ZyĈt4& #"xA8+Eҫ\S88IآCA5w`,(d5+$c;uLHAIb:sp2Rtg\plF|Us|_ҫ j|0Y~U؁-~M}@-yxQ]Aſ 6 aׯfzrz/_+t+c\p1,Y#y/hfQj0䟃?80R fEUCE?{Z^-ET,UYon{9\>!Lca2͢h'^dߜSp *'b[T~?v0z;_.pGiۋfgy߸_&ⳝ ;}n/Uo}Ģ7|&hm`^ĺYǶvoor5pv[)\(vyP [G^ 1 8g˗?\ї />URΏG?|OjHgQ*!?Wb8(UPmoгg%lx*gBH࡜G;p_a~j}ÞgMN5?g'5B)X&Wes3}9F,0%IQ)3\6DW^}?6A⏍^Mc;9p8 )GJG-Mc哽a~3K(w%%.IGajcs)j㳼k۱`7<\O 0j4la <}խtao+d&zV&}{}LJ{{N++}&nн(\X9NQ3QQ_fQkS$@"yC4 jÂ$"u>92!tPaw<ǝ]n_3JV]\_Wq1[N;﫢`ڢB-f5x0A)~ӻ̞WaEBp(Z*֫/'˃ k\ԋW$(H9w%X#KcC*dS @gfQه&HYL2jz"+P%Z3+a`Ž&n5ɻb:-*RFd &BV()K$qm:m:7Rp\OC\̘GR3a?ʼ4T-pɃ0Y$ (ىyX~-_ni)<1 \9,ٗV ʿWg䙨w[[q¼R#Θۏ7쌹g7`:]U~#갼K|tV#J稳J&8_X6|椘a'ԨWQ8yJw<\ ^5\hO>w$gSFzW aE;BǬ_wqK~l }x7$SúI~.F>E+.^W>fyF?x;7n:˚3õֳn} n+ɮzk͵p Yq>SWkذX-hɒʻ}4q2a\~淽lƟ'os2 zX7^}x32I{9:E*5V΀E긤9iԖJOsEqZWQq{w:~=q5Q;Ϣrùͮկf?z+Ybh@)q ΂Ҩ{:Dp04d+uup0gPф7NQBj=kL(\`>pS@!)~q줠d`?]|FjhJ&*F#r(D;MMgq2 a7QMnpR9&š2^\t M>|(,$SjR"*0"R9Id-noa9  3NcbLRf1pYJFP [*^jPц:ovM6P[ ʕ\3ER)i8!oifpWOtr˜c 6cf?H9q7mLv^Eo{nPB]z؁U7 +u ;$e6 a[o}B 
Dc4BH,PѾp: 7s"]QSa*ZMԅ&+ Y ÄQ)wK$(vCHc@y\ K@T8T T&/Y v-T gJAQ!] QQݬ#TDRl m?݁7PQcTio«q޳.BRםZ(vaTT0ȫ=լ[+!ZҠiD'fNc d5!qʰ&QkEHBӔB}f ;܀yrncv+R5yldc,HQrX!F#? f7s?ömWIܪaCH 04]Ӄ擆)#k!p1@T?|J` JdJh#F HìcZPjBoYC /BphzRELԴPy !>DR5w8 WXq<ѼLPcj $kuk q*C/f,8PFuLUĝWݙ<&C_I|sn6ddvê.NRWTIQ)V X;KK~9h!/fs7DRxYB/sM3~Dihʠvo1aJ qK9AhWKE)֎ n:!z=|A %[gcіU3q5qTdp9ZN"<)j+cq63kiiE'Jk\A<֐zw7 4p}}>JO? ~ݣxm[rQ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\}I)m~B &<uWlOb+ws$շ\9^+J+J+J+J+J+J+J+J+J+J+J+Wڢ=% t+67> G+2I : p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\}Y ?<) t+ pk4No>k%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p5+pNw Ѽ8x)峞(0lySmM(8>a8.l>szP}AtMr3s:EFM/xڄƭ?:q cl5v7A_@\TyܶyuhJ Y"[s^WiˏڷkO/ON>mͽu׾@C,XĮ׫hX} a_~}?Ȩm~{E-ח Y|'z梽=@! v)/jي0yC9ڌ&LueSNSJwSI[#'pb{wژSu@7TΡݮ.:N+gWߴD/l~/wf-uyr([~,Xڞ{'͆5eюw#*zl`E݇שT| . z)ɚy$boŔs(Y aD1Y"wUs~L?^K*`c5FjY.s?kkjo$r1DM[9X5cf=llrotp鐻 f߆QcJ)Oi6ZcQeȒSj-!0gߨMG߫Uomf$^DWbrv]es"+\]5Ԣ铤OzruÏ79!jwS߹zgWn6ƑהG(-Z=5F [w2&f\Ag QE$q̒8V3tc⦼q~>c\P1xYWY;3l۳9KfD*Ʃ Z*1ZKjPU7= 5gSȇ^&6όl<.l.3_=#L&]}F֤QI]^(nITrX$< [c<ёf-N^R99ǧ7YLYl>UgjvuO~wӇ>UeOW^Kx/,Fo"p/ K=C3z&cGP,@1p@.jmr)lsD(2($ Pj7% K!޵WXɐcVQ憹*k&;(U35*˜U1MoFў%'MTB/=Sy5Գf8Mn|-R*w,|}jx? 
U6F9HKLg`^ƒiɽppSn8}^Gti RQMjMKV),C8-(־5i6ঢ়R=b9 K9]-0+bFKx 9F-SRR G{9rн|=FQD c8[QcJQ9G+u;,5S,jCQS-Oź:q4<"_ٞBޚ8U[odaЀrn#ԮnZ)AVc$qas2ֱȼ\+9V'Ew޾h35aEl92A!:,%v gO i S9D hai'Ɓh;s  ?rl7X9{<Y!T|YM%K1ZDUɻ.pO(>T1HTc41C' Zy$Z0*YìNaVm".ɜF^b@*XMpA谖@J{BC$FG5袑 !;^*ןL?`\ocUГ@Zrl:>BdQ6ꑟnP9"Mq7sopd^_l8g: 6!rCgA;u=ާAy^ǚTo̸aϪɴ/jN 0+ XSu `.dݵ-ƸxPs| v&t 1YV52k)f>M7z^4tdX=A(13@6://ce3Oq@R0݃Kzζp񑃺}Fѧ (t3\7W9T:+=l*JjݽM{yORPBa4not|̻yPȻyz-mͽ,$zliN?-GKأ2_Y~-{&H'cu\kp ov@6햷YPwf0No}?p,qD)uH qr$(G,q%-q{؁,q{%.䰑ԊJC,sx2Mڈ4Xpc(l8 ZugNjk#g= ?~:/;j0a7EPl74ja^/2 8k"[H|-9ixg¤%IЇ UV(81P{C@hgi%62C o5EThR /J^;%x%8٣9nb9EK%Y>Yz(黙dIq!L*E$&rdbbIHa$ᇺw>nբT݂.x ZQncIyˠ}Ж_sS\^%%q09Ɓ*TSr0͌NaΚy3/3BuZ<|+TjJ30l(W0?ݡ0J 4]M@: $,?Mj6uWx>xRO&نm&=tZeOxT 486c d>6;9kb&F]F=_B޴:G_ͳЃoZl8BomNxUYp(@WWqku.L/9\l4, Pw::TKsR"%S6ӅbK> B>ir{~< >뛪+ת7&/j!ŪДfwRt6ߴdy V GXRK>^jYh4O<2z NPp*1ᩏ,SpZLy+b*%̻.(O<]ž'.)JHld)BZ!̱*FktN9r{@v~-rnl;DEG*G .im1CYgDd_{ձ3|2yΓ悳 R*DMqX'< xoB'P >u2vD26mޣ#͔mu5 5K`;Y| HC9ч9'6"H: B NBx uGbw#1Hl<#1Wcc,ڄq1fyKN2-bq9`S0>)wr6$ưsB5H@Ddci|Ф'L&Fa pGՑy1!5Lc |P'BVtIx\ҥCΗ ȱ&x:Z\{ w9ڸPN vcY*NX!,GncPI/EG"! "E) +Υ5cwi[v;{ G%&8k.^=}:mލh0_~.ie0O*4%fnv`@GWGM^Q4y}1M9W*^DIǐ@+h@%X MW6 ǥ&YjAԎDP&("8J#(ATXwv~c>gD0k70qm&|}0\6=Ոj2i"U)_-)ҭ0a+٬A# 78'dU)83ä#1w{xo_  &ǜPʴ -(a[dE#(d @V)͹nΤY $D;>*B@֌w3l8bGh+^ ߑzXձ[_.vSʠu#]DdƶzWlü(!Z ˽j 2ײ"o:ԵN}?^pŴ=Fq/6 !h4Tj&6DE24XhD-1̒%c( 2&gƀK+c($9"=FkuNXwãZhb6հ2frT|_<9#9bv.Kg&A>kS6 )"ߡ$!݈~$i'DXX;dJh$M8+:s r\#FpoG˓Ff M;DH8$# kJD`^)ow(  47;3*_ok(׀7`YϝG1b*E1'ٸ&^52G)@wmXtS9,h)DːDph$qFO͉"6bnho#-EFcۗK\Eyrkqyt2Z%Z͇8'N[IpmKOt˩}{g\TNQS x VR T ʢ#a^158FRi(2J~s$hfS}Q!OtN G-lB*ņqbXXlf ya,  /Fm~CZ\ض;???Eױ==}~#6ɛ2HƍsY 5^,@4w^ĐD[n`xoT=<>wqaK%8 A!R };O_L+]p#Ê/]lv0jCE`7x!RDq*kv+a"$h"gs%ĺh4GPв(*:,Q/gXb,h.&r(>haHeRP6$uy8;޼GiJ Yu9U>][}6];>iqCjw9޺8Vv'"Ygi? { <~UV[T<(H|8&dMA$ܓ/d~ Ew6ש"9\6~kOEjq OY%VP.-ᬔDC5t>HpnB$XlQQ!aW` *&Jrc%$@)Fh>rN6nrEq,[*H\"# 4d!yPLqr&ehw*GĀY@/]pvBsUo!ϯC&wE>9ݭQ"؁-6w}j;|Y>sV"8N4A*yD-(:"ѠyIV LfPg-Tܐ%o\NQ$ `xP iNNheiV "Ot<)@Wό3%]W'e+. 
F W[>]?VX'g{b^klvWQ8d2mPw"E=>,ėJx6.BegっI1cqp4 HE%OD" aP /ӝZ<=3_eݫ__G'/|bgGh߾h!|qhݽi]b|.O}};NW>oJp SЌZJ KN\["B_^}ݼ_ !©\W#mBLl}<66हh:_̱9ӓ*or|8`>ǎߍ{8Oi[haasx&e$'/~7-2ͱTiѼۓ2qhcewaZvooru_ v9B(6 G48_q=gb5Ns?nrF_8 MS}: Mu}6]FYk[z?Lƭ> Cs3Z26S*]B|6=Iwcg1ujs.IUJRխ{qv`frVˎ&),$Yr5}ss}_A~K⋵QM3?L#h3dl}ff$7%҅ }=ln|Wc7ݖ>y;]Nlt64/36<%E3lv3ޡǼbtiu3}b7|ysʊkQb/wfxr˕3:ڨmY9NRʨk2è) I H͂ڰ#!!B="[ݞ/FAY/x[Y%C_yyZ(!)zid9"򘨢j\e@/@~d_JCD,/{K\srLь %*7zVl'v[RVC|io%C it&Kq\͗~|i /MKS {|r$1x* ) VS+@ ֋)Mdrʨ\ٿf >Feuwu3=$  ƭҠr"  X{g5k#E  x]E^koǍ8j~!pߚ8zݗCw`WE`(x+}Uӝ}6 lޏSm X"s_^_F+~-ۭ4aQw_vʏ}efZ^mesl-5~A?|,VY$^xp;9Γ˸ ~ _ N~)AsmbIhz0aTg7ոX&RgCf.r17Bl^4XZN,ƇN|\bzo; $r] ؐ<P߬udFƟjMNP$´G<Í}sƀ:eKL{+RF OL3,lœ!ӗs;/UzB<.˛&6]qFQLY247S.(NJؿރ||s2,pw]"oFWuY8$CHw 8h~9F&yC;xw_ީ˃_Χ[d+_۹+,!fbJX'cTE5{j"+ypIFIу?9'G(#棋M~s7x8PJ BUH iT!*Q4F678EF BUH iT!*QEzUH iT! UH iTBUH iT!*Q iT!*Q4FҨBUH iT!ÇVBUH iT!*Q4! >PPE*HE"UT*RQE*HE"UTT*RQE*H'w kQE*j.r"UfU*RQE*HE"UT*蕗rx~kLV[3JJŸe$QDW!&VR} 9r0gQPwd]5uG>B9rKҷ^6̮cO甁'D %(񔄏ʲȽfI2 hѲV(c{LwqV<EWX<#Sܿ#x,KH"hة\K -hqj'ނ$%Z9yX;wQi-4Uk,#DUɻ$.'[mR$v7Z !S F%S69k5I, @갷$Ld*qDK)2W=e!Rs# )p.=4u[yMH֏Gn4Ӟ$:ᢓu^촛v3\r8Cü8F}NGtz4! 
u$9\8oZ1@~84c5o̸I5cY16N?id.dso_v|(Ŭhy±/dg=#!FYAd\Pȴz`(u\۝|2!Q=*^iHNWv~5b Wco㣏z6=vkdΩ,[ 0nk+wpK"Oնy۔\*sY;vbk|k8E-7?.rvѺ`Nni~klvMFϗK/{^iy>& -n̢6O+9t!a%Vu&65lmYK]JoB.J5 bto0uvڔAǶNHu캔(uveit:S-h[Gm9c$k^wNܞqىX1HVLjWd4  KiC" dc47YHlE]۞UCkOǃ.噭:Kx²s@dϔKKHZ6Ĕ*$qFbkmXeX@ #C'8> pb} S"J[=HGm[̰hBP1#!H6X"Xk@ W&3UI" nވ>{٢A^lko7sUr^1hYpܖܲM2  R5HE II&ɣo))é>SpYͬ\e~^b۫zKHE8-_[Y^x^:WAZCQSƝhA!FmX SP&rN*z%@ m^XrDrty'y)U]o·aAyJzWaX* rAY<6ɘtآN׌1nI`PP $FX}MmQc)Jm(q UK1!JY\ PM@BRAVnO7FR.:%: #AD!P)Ƞ>ZMp^KRVކs7{/w{)k)Y |P:bhws:[kyz3l->dM/;y*W r>Ḑ ̳6x@ 1.hٔGQyqn{]b8x#2E dh"ǘ20{Yh`K|D +\DL NBZS9\h*yp.u|pv䳎lBN?/vd鐪ƠXD] #4쒍^I'#Jr`>Y=%>#?Ee7|euV,B7&O!h2"cAOZ1~W d 1%8L&I/^f^z6^6Mܧ5KGWl x'HyJ3Z'Q:ȢHAPiLiR@h<.0e,"<3jk/^پ0_o$r_=3߲쁀)[ P  I+LN˨n r&Vn`cVI;zr5Ak4 H=J5.Kt@:,gGr6QNٻɪ:d?{r@QL.QlI EEE01$Eʥ=+||(1(P k ZdT(W9 ##XA0R2ID5"[֖\&eѳ ٣,E6}Ҿj7'jG[j) :d(uYXYET):I4ULBhOpWPUkb{$y,夼#e775;B46!xV˜#(0#Ӷr7{1ǎpiRG4*ۍf¬=$&t:(yN; h%[>e6҅h_6$? O@J4E2kQb I:;%9:":>$K滔A8YXh=0RtR,GլR. #c]H ,d мvOYc2jqkhkVxWo+[CudW?d/F<72(^6IUmtL4gTx/i) oCf!ѷOa>ƆR1 ]f~2JJG49×ҽ\?i ̈́v679 V%٧˿u{ѓ'G'GZYHdttvy;\<|Ջ,+^WP_6Wht3L`N»ѤGǯ3rf $Lh.λ0gd5c<5Bl ړ{:֍ͼnIaɴۣjE6d~9ѳ>G'^^KnuۻJEf5R;L>ŌwTK ϴ<Ƌ|I4IAşRDN@9TeQzFE!BvP:o+˼,|8wz:rG4ViL8lI9umkdvԱo!m4%T~;7%8[/蕌{R=rx0H[l:%h҂Ɣd,;SBl-ISZ5= C,b2*zS}j+a}WpV gEmz@8T#fyiM FKpG⬭ *72wz76e~{ Ŵ$ ymun;??{ԭM l|W`.|hSy2c ¨Y6(yQwFaj^1=Qv,P. GD" JK9cYY$V!J1K0I tEFm `Cew= G6jc}MxOǛmvvv apK˼ wo? ``v x*0.EG[V;o٭YFlkK)̖|9n- A{j(ڐ bIH31;0bI%:2YYã%=#JvD'Ihl$XߒA)68S.RR)1@ǖUR@[w%ZooоݥPrg#v=B2c:oeT}ttc:&k1* k?!JqVh^CҲ!Ʈ4ݐZcc<9s i=R#Г1>jCK0ndHh2kGגx3.ŷ8je}RtzV=dk@YByn ػG-p/r[d!hIY.ȥ&E%d^O^XZeǛ!?Oٻ6n- S{I8|{nbhO[YJ"F-K-',TP眏!#J&:P )K$)P5"ʱ[ņ(Oܱbbop@ 1`c $ryg(Q6%-|xK-2,;5 E? 
?h2D1.М*N)}aE-65MMid&el~[O]5^W^(WB=僺l)7wv稼A/O-!E [N,j-Qh$Q"*#9D5z)TH1ǠSbQpKJ#c܍J1,,&b!/bbO33~uw;f>|8[?mYa8xTmDdrP1jf=Yi!* &..\8 س,&KpBvD#ИW0b݈ƣbb jmQ 6TԮ 6G"M4'&h+Hˆ& rU XJZMؐ߬hȗBFfHhϣB x"^"q9['$G:*k a܍Q]Oc}@o4YQtN8U'~j˗m=ѱ|BkQao'GAP\9N[.,i"VpI%5 3sێ!`rZ[X :aEA 2 r\2g`AJS s hq @AkXGdS,p6tZ_\ t1w3u!+lbv;#:,m}T pb $L!.(^ azp秓}1>ľ0ě7jt=%ϭdi Ñ8.QN Ȗik)g[:֒:P(TͶ_AѭmnBhEäf *&6$f ÕҭP2'󻷺ng8xn\;/J}(`PUBr!NR?E#hN&RZ ڵ6mdRgE:ZO<]h=ݣz$jg|N,$[$,iHr$(M鐌 ьRp+,, Dn/!>E3+'l6W/}lz"|n. ~-D^uN>*|i$E}bL,9A^ӸQpoQP槸E E(nHO-QΧa u~纔 th~wx/T]&':m^43~hK떒Õ}-ۭ4iUG3^w/ⰿs矴WR.}: ~7hm؅?E+4 Xc۫Ͻ^l E|lc2{&(AsmbIhz0b'55O|\"`x고NvSF>^9MRffwybiWҭ-Zq% HJש7Z(F%"9Aǯ{QF j|6j/&=+8^xV%qFh+E`mPNL{j:KN+ y.v{X {Q[G8U[+t2Ep'8hV6^ːa&&u2FE\t$+DkD>piE(^]Y=w xr> ,Su`޾98l^iΛ7vngN/~q*: 'Dx>+8 JI38^GU\DBi ).. ">0Γ6jw_W/?.8נ/uܿqև[S:? _7___SqB}|6z\M7REQIjT@ Q5*FԨP֨PsK}s{]jT@ Q5*FԨP`,y Q5*FԨPjT@ Tq]|l/ZTgq q4/*@j^Em['A\K1STaGB >"Tv/rz:37Q幉(~aV ТRiJec`A,6Y<&o|QmKbbmb2'Ei½ǪgE슡e\OPck^)VP,UTÂu-\:۞2oY+p/vq;vڮOd XyuUCHm<2 ZK$Znޒ/1MUgSޙ oV抑,3>>y΄k뺨{=y(Kxݫ.;F-qu19w^gΊ@NZ%8\D*E `<$C -<)&C.&3PBoFg.{.AoLmb%|n#Մy"F itq撃~c%h QEz| 1FeZÅy7-jFRLJNy|f<ˡӶcYoW?{ƑpF~w`u~8`Y"S_Fc1#r]UꪶޙoM1c L( cFjƠp9KOG#*t0am*Z<0nyJ0D]*I1i_- . 
qn8Jǘ JLҿv!َ =\ܳPK7kBh,~^q煟Ndz>C6cnIJfv) Y4ruGE):)9E|5 GIxxVku·䁌]d,gc4ơ ܈$Z8g҇ق f:n`>:gcHIse+shCJSx[3 ׮"2+y t<++F4V-Ykg _ņZR=/R&gS)KI%0 Lڈ*p"E%ؑdO9cOr?0,I+xl!BUH_IDN&R*Ő>GԊtf{',K˜lbelX6IF4K(#3D2\z\MҀp:Yښdݒ'' oG;yDS($"BNrt\>Ԥfԉc"=rXVzCa֖ȾgZfiYZZ3=66>D@ t+Wޓ}sdcؿ9hme z=,5$ݎI4%:ҵ!( ((`' 6 փMAV% _[ @8uP#TK}3* >r:PZACɹ:S{,P͚wL&ɡ@)^@2GP\ x"GAJuBYaqIHC'E pD2zPr )c99HmT9;Ϥ!Xͳy-zפtQ"ٲ"&em0R)A3oсI̳d.P:h#UL/u0;wYT !7$er#2eN!#8=1j >⾔UK`PD H  F#Z;+,D",[Βdi^%DdozK5,=""i)52N@"v!ɂf4GFjJIJt*G Q(rpA<`5R;zG[3U??{| 嗧Jk˚pu" J0pT?T8@%7y!ciTax#~x"w ;%SLyVJa/]$6gHʽ̒9P20b+8`)Opt8"1xyܛm MMV.}w9-.ƙigO׿Kcu3kv\bTO&x㤋A6dl}hE獗!~0TltK|$}vAZw.,geל9RF͢12cI !p9'}M%B(4`t`*z+ -27cO^<&dt ]xie_B_Z_I>;0.;+`1tiBwqrѭ]GȒR@QdyK 0W-/$,_p'h})ϐY^XXugY4ӧKl)ʘNNf`[Zn &*H1 Z~oϱBU ثFԮhR=a6Ni9G]^/U>N ןo狖+llA-@ZV[6#z"H% խYy4¬ ][жEf }BJJ-lȽ@_ډAB&P3Z d:sgcKm' LcUVL(,}k:XeZc&%c֗7|dCx%+e%*e<(0>UD9hexN":tvy?ȼ{2} ;4{zkҊ,6yMZmsn[ |{oȾPCUGyO`IvP3R-zZYҔddX G a!?캽%tsu/Z7$A?1 :=-+L$KAMPRdB| zD2`edHLЈ>ǤD %g291w;t2>lDYE5720.Z0BF2g1%(JšD;lC\ݗ;ջnd9\7'z_uIê۫,A*(agp$)17Gr\0 K)5KKB]$Eлq>^]^T~5 !UiT[X\*اkS s{uw ;# } ~T. Y@5Iu oRb2S*"k2VOJ4u,<-p KL݂{:_48ְfXI:Sok#I'|c6* 7~֚^Y6T Ne:5I?o#,re{%:Ldi"]HڰOD=W5.~n.Xl+/4Ժi:{-lOpq6eM-jݽM{y)qz^hy= Mnߞ?.knQs>閎'EΖ7Sp7DϖMw_vtB+Po [=mOi\iDvatO%? 
[đ%ydtpnjiˬP=qGO{EKrA1D Afg!9blZrځ2bn7~brڨ,4;/۱%ݹt?Ml74jnj=:YlEONjVȭLhϲ%IgnuQKP\O_=%jxXPRYMZ6NoIƉИrgUБ;iRȌ+Ӥ:{aHʸd$UW+3NMrEDo6ǬHY n5jIwᒩYӅ|bK6SGzv!4]%|*bDoobhv?YzԬܫ䧫2PVc%Ϊt7#W`/_zűLb[!MXҴvEc?y^ $NN@(<(I4JH1g&j-LLu@>/w!%ϜSR(,sJ{UN:>2cFp;6mmOҸ.gkj L0M9h3] ,/>;g?NJ1ܛs(KR5Ռu"Qh#%D.bVK8z}un^{[c\f0WoƞhkIil '>AgݍPj"M dջD$%1p3 gKl/ZŖ^IV#}bJ=D YlP<]6fX;p-\BBfd"(IŐX1fHQg\ %e)xDbH20x{#& Ά0Ej9-opUIEIZmkSVOL7_u׃1b:eA9Ry$ &bmDMg p^!u8nʕb, \t{ь1eL[cx7)X:hIxA[Yy0 ,b>=7W=]wvN5ٵV8݅PnZi sO0gXW}{ҰtҟE'jI`\f{Nܗ>`Z"џZE:]MjkZ[ܬv͕tʚllz"FhL vOӚe˯d[9rA|*.ɋ"Or> %pmb]*WO\|"/I9T!z#m&^1$El:g h1tcZu֧ Ьuӏϸ<^lXMxxлmۤvwx7DTO Ap`2,>&~^̡U}t[Tidj %hvݡZv'j 2DDI)J3h䥓cB3Yf!9oiODVue/\r-1r\(+t(a*b`V{)R2W.&ۥ+\ޥcprt^"۸7Lb)G[뀗/]Zܼ'|,&i|i\YsWOO&R%#j ~q~vH9C*-o²˲V|j.T~ŒF-wtck1wciŪ7ZݔG6E£m zj|ſn逿X׫Ijc}[ޒRt1mof~UcN~,2/)͊]`D:8vYZ O⺘NVu/NW=`VމXE.jI!oztpQϊtr՗c~P|[]w l9V)wl*KDa:oMz۸/tX.pe [ ]ﯩt:Ϻ d){< ÄxR%!34wco-8KB# *dN</ >su#r ?y>?wO/h4~sK^6_~:h'6. gi?=冝P=ʲגZ_KkI|-X$׌%$גZ_KkI|-\>s3cڴ$NޒZ_KkI|-%5LK°Z_c]-%$גZ_KٖגZ_KkI|-%$גvW5;fZ74٪@cKNAYaA>oY"5KCҸ,Ew36x3Œ\V53C%'<18/rUXvgij:EJ Y)Yt;YJD) ^wV 7='-h!"=bZ%.:  lJ-^ng}nQ>)&ǩ>t?tw{_~Ӊ͏_JbL*s?{]w4㎗ϳS1_q?/<=.Эϧ޼we˻ /cmՄ{ݰ5|3}m#P؇V_NB2P{ޑuT,}3./ucQ󡃒쾘'?Q%;l晞 .}gY{k2dNYHp;u7_Eg1N;~;, G χ<wmLk~~渔-=B}ܛ/)î 3dN8M"/qKVh#;f{*P !cr̅G%AlYلz?M͛-c^&7p9G\U(|]H\r{_Z~חV/^J_> M!t{2)$2Ͽwq4/ r 'u.RSY i+&e ;9g.ߓ̟+zmmhiPigʈG\ak>Vf'3AD(R,}:!G s&xDai2`l{ݺDAQL@o J988CF ;,TK r /5n[ 2_5*:wb^LpP96W SdVXɷp"E0ՀvK{ծf6ȀÈ>"g]k҂ @R b)v rsO*-k^5llݰ1Ҷƞhkj'أEsr|RӃ/` ә:ft/-Q] {(zoԄ˅u]{D {] JYgȟQ"9q򀝈g"Lfq6+UեY 'L%DPA;k-9\FLT^ &t* L ӹ2~6gzI7^x?fV>~>rp{hE=ch[/\l牨3V:g2t8%SI.>و<[Y,>;sQG `;v.P;ߍdх=dF\dːedѸ2ODt\E NzYcUX !t w)鬰a;aG~g-F9y"2hi(ܪkGS:8ifUл9W`99gz0g3^e.7HS8;:N3r0ATϓʥ,R)DfcfF%"QÔvRDm5RVe&%VYX_-w)E.18@"Bbt&s#=_PBW[7B7QY?RAde #\)G$D&lq43>gBJj q5Y@v!l%}V| b 䩡h VvCMqh)ggqqWBWӓTJ\!>dx>%Z<7;H'OƞQꔫcTV'ײRo]3{181' 9:!V\$+G">Y@HNs*2jk2Ҁ,K.zR(Lˎn97)sgXMۑW4X(+clXXx.QŒiyp9=p8/ct嫀|afeQ^G-C,Ƹ g6Y gae Q0(() PC4ġL&ߎYSҘu2 %툍Yp1wi}Q*64n &' <3x-4ÀH T%V,1YI`a4Xeu\LC#D21CE aML,h c*$N*@Nu2Wj܎S_2'Ri6DEDUUCĆ7WkIJ%d,$ < S>uz\rƴr L( 
vkҹ_$)V+Dq?(M@ڪ>3LXy7䃜XsNw\ ygWJ`}u87X2NZɥIaXD a\΄FB VZ4D95sij@wT)iI>fј44 ^%BFLk'ʱ (rBB:ir9Ǭxbq5Q kR"dP U< N_BNGm-|NHOҍ5bڗϑ5.3, I[v}`F.e,i] ڸ oA>Scԓc+5PNPhJ|_4RRqR( , Mo{Ki^4˝u g-p CX}ltx@IۀY)y@2:70BmnalFVi4Q\IXn.iɄѓG#}l-wsH<|mzG|y~9 wrz*f `n0n>I1?} 2qrh.ÃyIA2 R_R iUdIj%+f[LdHwN3~k/V!#$:x%SԹИmXi#LmTg^QpeRMlJ:g:>ڂw^av"iuM꤇դޟRPxUz(a(K0V<;s8 Rz#Wz9\>.X ),꯽G{z-רkDuȭ϶/f9,WhHR ʇdQRr #Q+FMhј.gtIdk)Sw,/hm:tE)9XsQeFdhAڡqRD-ZJ3{ ss bќ3"XFl͖Qf=FQ M]Ys#7+ 8fI#>aό_QU@ڭ[dv/P$%藶$H|H$yVq FhK6*T+椛Pp"N)U^Uխ!oz\OSfnhib%-(]I*泘ޤ.&kuhɇKk>%% mFqH!T#a%êzc>4[~!m GZ⺷:<'#Fv}a[>=oҠHSQZZ U `ƺAiQД.2%ui˝P!oHxa ɏX6&FZa h w,ij(Zj\$lz` OL!Qڣddr[*/ c$ׅ+BHi Ӧ.MVIlx C"HJz΃$rޛմGP9Ǥ ]LP=i PsP#q[^'5'iY1 cM*'/YN$@{c' TsӱR}A5\/K QY1Deu/"^ y"zL)zWSzlh__~sߧD)R[yk@Jd*,4ߡ4eD1?6F0ZJ_Ԑٻ~)B+r8:~ɵ;~v2t_FO?ퟜHR3c)YiMF6Pf+WF %ZG."8FۭV\Hr+^4װ+ߚh?EaWǺu ucƨ`,%p7_B!^Tڸ(4Dm2pH*HRP:d8ҡ]3vքokJ~2HeVX[Yj,(QUv_JN0*^;>}Uejg3k0L#u"~z:tY ߬!Te}1Dc\vW^3GҝAw&XfL󔪠J HX<8&f`dQ:{1AzL%>\>;%8*9 <2$ M3: \>3A/掟#P$)~]>_d)yW8_c+HBϱ^(h3PjYd(߾9)BkJDm5RURN#8 JUւ&h$*^TBՅ@k$CH(}kvp%mţv{{X D~ %,%>pbMwM7L5oAu (r j Y";"1fH+˨ȜWٟc@JVš<+K&YפEuxYZb!5Pb82N 5,q|-8VW$A@8j H?KV>V[ߖ>zM48;XB}o]sc] BhKzsl'4C|ԛ?XGs G]8BgWq?ǦwIg7QJ(P캘LR N0[8 ׆dڅ}*߽3!1;Y ˱F@E2ic0t|T:i)i'F0d]1:p-[ݕs^/Ϛ/|qӫ_S;.@Rq玘dti`l*%c`i5T/F}~HiNjvxOVӽ}v2 ԔD}o'8}qz,HۯΖ#[wxHQ:tmNE'gSQ)IRqw.G_'S{7yHHjt u%.7[73[ e-ԕ8FVhaz"} ME|#Zo0>[]L{lp.iУBSDbC|Ϸvn[9ssX4[Qեy *Ъ43V =ft4)G;mB0B~ݻ+x%c"5k4M2]0 ߘi=B]#N+)Z`ڠĊbG.^"2sGg$UUeSwB=u|Ν1 d֊AWO*; {|ҧRvV<]`cB84sϺo.ɳAЪ2KRڽ##.RN߉|D4TXnӻjC}2eߢ_ݻ":9X4jEghm{oHf[`8JґiX\i*:g` 8`LW1Rp>R>ݨp';xY\pOA vt\NF@y=u~8AHzhwM5?2 v}пfNryz Zhzt=kJtj`5. 
OP%ت*/ c$ׅ+BHi [f%Z%nI;B~_-+?BiF>yO\^}Ov8 UQcelGXA5%30y`,/+.k%Z RI,)5(9xYk.:EY ܠZi,hwG5isCE9x{q~My+Jgw7kӊI~>^YSe_^O/RXy7wv@ cagߠ2 ^vZ []ypR3e7vb, pM) GFoh7p[*!:Gvr<7ބ7P吐.I2%[ڍ)q -I#F yX/ݵvK^hv!!/\D)O.X?vʥZP#A*J9 7yEMAtFFjYA4%^f18 _ˍHXQ|yĪb^}?z&(qbC]{ 9eLak %la҃dq8%mn,9| , KQV)LTɔ`FȇֈomP"й5ͪ @H]<ɔcچh!PJ%v䮣fTy _/ #k:ޭmQ 1)/~2*CD'>C#٪#]M*1⤂JXUe ѰV/h dulا%8{{}W{OuzBg>)GX ;o>|~( ;x'd4cd rcGY8`%V0scot_o>nǑ*vFХN"O5!\]4UD)9\Mκ[OXjͧm9i&Pk߼zYgg Ҡs:*Q Ǵ?/Su ߻=Lzü:l5OLjWZY8/*{E y = +.G}Hލx**Z١ٲ1Bg(g£s<^9R=X8.G,Zi{6"V`hO2P u5oB9@Q.>9ass+ӣk c=́S>Ѕv 镺7ZEHWƞ<'JccN1#N73vaXg 9pD}H~z"h])$Llĸx7SߖtoID $oZ5+ٲ{CTTWsDEIRlb5-p)0`nw=CL# |/ 2ɼ E90 ._Fl!t!]g*|6ק)!`[ENQ%ِ/̂ݴUˣt 6 睗YX8K V:CՋkt:r'iǠ _ Glp89J85\ ʅBRN>Q4yw1awxJ dz2e|"J0f#ou+ѓ3^K)ᬚ4u3y?[KQIVg]4ՙdo kUt66!Ani]cejJ ҂22SºkU+]oGW}Y8{l._ iq$ʉq~%(Rp(aXⰧWGqwֻbɞ(? E={v’/8}=g,TY3no௷͇/ o~U+jц.`ry7^^]䃳ã7 D5ξ9x~ۃ4x~puƧ9o\]YiYnȺw/p?,w}ɹ U(;` cY߿;J:аlA5FIJF;J3#F^N|yu̔XRpYH xyVe ,.Tŭ;-K@u~%yh;4Q#%%Ƕ@6>%fXOU\hIWcUwg) pxgm3`0t<0jCpw!:$c,KX)$HN(4 UΨ6{-uفߕCYaJέ7bwlZnV+5A`ƨ p Q ZN-"nF553QKvB֠t{mv͘jɉUSOM\>,pAn3H$HPɗ¢(8pBKmԯN nUbGsJ1@9ŠT -}"0K=y@> M,&-4=3 [gC30Tq^hLJPDAR6za$[:GgJNR![M~ 3sL.E j ;1oFXiEꪈ(V%[D>Q)7Jr:xW0B`Xèk{ Bj8%4h-1Jb/XH*B&ke*D^Wa f_#A*@V%O)c.!ZQ썲 mXg>0̜|װ(HP쓲? ؐ7ʳ1)b 313qo~S;D.g;JigtĢ.hb` VdZCOv:& IaXVK5&x >X0}rU+%EϤtEXҋMN}l-L`C)$AR^)# 5@;͓-C % TbnlOJd'4GK沍ђ h~~wv/y8Nl=jbmA IdɅE!3 "% 9Dxc#]DDAVձBrPhZfOSoG|踎PHQ10V/nPneZȋ! ( 䋮9!=f\<+ FWQ96HyYJ`:$_O OϺGyPP|J~lFh.}3?O{LfZ|6 :6enDMk<7IeB:V`^~7Gɶ"Hnl=,YZ Qln$v%y% .L*+xT&-ָ6ʔ Zx//letf[:ɧ1 hqbG{M?Tֺ- V%!nGU-bGK!$BpOQi)Ԏt^Z֕\^ݺw,Z.tO6ahU&GP2-YV˾euʖ$" 9_sa7Ro̿i5Vo6yȄ ]D+SVe(AbN` xEIIրP9y!}Yڙ+:i[q7wZﴊiӶm3u)t7BSv&&"dPZS6C"`$% :3 b/ر#Iu6cNٖ8HţP>*NIx9TYH8̨̓^0j'v?m.6Q+weyq&Ԃ>߫F|&l jN#LZ$4(Gk} Ĉ<@}l5&#8F6EK mwܓݶOܷ -s~-›'v2T({AFfCf>nB0'lFZ׺3k];.u?2$ d84N7JVE>,]vzO؟j7>Tk[gj6v-4OMLQTC\_,tKZץʡ])N4n*weZ>mP0]6}ǂл4x7GonB=P1(X4QlѷAkU<>4a ?F/~({o[\lxd!߹DZFTk%A}Gvu*=w݆D{ƹ1x210µVlc{pƙm Mtͦ2[EF8ίw?7:buџ${Wz;Q$W[uݎB +#ћרCfDž%4ME)]\ͯ?7OsI뻮:ޟ}l$,;<[{ݸ y9 VB6:]~Ǧ 3kLEtVw[7+?<铕Zx[*.m V)w/ O8+&|M8Wzad/A? 
[Thf}"K%xy2 Zz,0W_ZhQ) 3G01BF(A~|=@#*:̿ 1#6x6NLRL`1;%K0(tP ^y cP*k]x̜d==gA/֩/i;^!lB:u2,N6O:3p$@rqE(2к,{z>B8u7gedqZ:/*T2D422HYJr RnTv I%P[+:D#;7cs9RbL4RER QWu *jw}cSqe!b0, +!5g+3m(WZR5 fR$O u¶Xjx!mZ+,*MgPzn&NdÀcz<:o䒝u*jx:vtL#[q { N!5]L]f[ RK_1Tw<[eI94D]ۨ^D7 x-)UQVA ]-{dDq < ={^Ab-ԑ٭V_W *͢ AENӯx]E2XKR>%kűʷeK:aURG? Eæ/JYV_#3 pW_PֶJA8*S Ŏwܯek=r$G-{S9ow-FosXĭ:ʼnN:AcArL pV: #*Tcr"TFBAB_*)sJ3xZ^rB NX(NWP=;$>%njHs̐XPQmj]D`%+{̻n] 9nX>ҁ|j@xF t#hKj;Q×_/a%Lxû`—} q;)tHlӓp;!wШRHG6ߩռu 祢7!~' Y‘z gm4Ir]do. cD{Yx_tGd^YKj@R,J$`:xcF$"3]nS#8ہtC5B|;5YNJl@$+ر @5 I c({s<6ScQ nitjʹ9 e$HzPνGϓ} )qD7骼߸5EEYм".\cQW|KY:_x}]p0^ƛx62DWhߴ96@76֛G"J/ gЁImT,0A(Dwcu9޳!:0XRdAyI1DEi3PF d:C;/6ol'`A!X4d-_$_ƃKʖUua5~,G;%_aU{iKҋe=ƍW1;B9 FJG#eYW/'X ]E#u'~~ꂅtܝQ]_iPެ ]0VC[uۅkXaޢ.RMݩG즓ͷ-="nӥXQ:zw`? H7aτgqә O9_J7m[8i,YT-s~ؗm ן oH0k Ӈ^+BR􏨴GB53ԙ] Abr4?S{ק*pɅ'w۫f?~)B?=Pqi[u㧳4J(6߫x/?DjAή\u AٓAI *,GS!NfKp }YZGʩB&+JRxoٚBa-=1D~y%<5R51tCk(y:Yk ?䵷0;tc9MT)1~J ~l`yѥ!6 e;*$ ( KJ ?zYev ~=M(XǸU; bB;_8?"׫=͂bYL"]K2o ToKv1f3zs5Pcyۏ71kmxb,1Ml]jNg_k<%~-I `vAPdr09:p(,1&[¤.8iQ ޤ\#G ,r\#&a[b}}] 6a6ags S \ja d*viAꔩ,Go4u!c9{y}}o;dLl ,5*V,4Ƴ#QwPB!4@p2y&c/c'6 \Qn ԉIJY~C R$e _VkcIx%7 >1dl NӚ¹42I] 50Vw' B 9(h8T QzkV1Pq>\lpnvˇe &e'&&}wI23l_(!|tb*SE`DҌB;le2'tbt`1Cg(6'GtrGS42\ I \Tgs]鵥Ipusxg3~vxmyUYf@:öᅱp7?/3rٗ8f< F=N6ca"1Fq 4P3?K舧XSۡKV(yCf PT-=cl"\$(Νix/ŲCW"C,` \1tԈʫ᷎1vºSޓr:cEhp&EzJL0xSvdݞ@/icwAkܹ{M=fqsS61DSrO1NSrARWl۔d>g&%҆@5PtYڣۉ8&o8v`̫c`\A)<KOZ~҇!I w$p & za \w(YI+m"/jm,K/(E3g-}Eb f6=2W)% +Z3EϓFzx{ {N%O01)i);tRzKV1ρV贍OAo t4mYs'}VJ`p!i(7(H3a$ul7DxO>˧T2ABZݲ0fFf;vj@s$WKZXhdl4)*u: ;CA; ͌&oqozmrk,f, a[Q >{BB%*`Bz)\#\m__ղڛ^;?U-kzQ4ir%/L1c_^I.y?,ie1JWZJ<Ŵn,2WV4{,$i\ۚ"m[fe06!cHvJk|HS>2]uӹ$Uç -EHT dRIum6ᥰ֡8 {K6=c]6h]F'iV$cm-a+n_}ilS-`P`i̍PSd^˩r^6.TY@֖R-a H[;z7טTE;:FL(A5 m5"-%W\=łyiI HV;W/%pO`Q0Tv@fJh`WgsEOEE)M_E BIzcڳٞJ]H_T+a>fY-Dd$$ I8iT# 3LiE@Ft^^$ˉ/di &QNN;G % aOx4h 2=KcاnE# 7<":f̔'X8%up ",UOӁ1rI$u]kLJRҼV'Έ^4Mm*8I[dy`EH~\-@KQk/hHA}jeM R7^&+v,J/+< h-1hoƆ"ЏBGdQG ң<^Hzi:-%\l#"u% zx=Js&]p'%}iQlxlg*0X΅ː#j_z_*J@O#!RVzɥjyh08ce+3L73Q;)8DQ VG 
H"Hj!ί|h)ɀ!\hzf@YbCqXY]M@{9v=9\E?`+=ݷ{ݝ{o{}w=ߝyC߻?h@{z߾aXƝ٧e_/n{_\Gh#_6?_BhOhi8>WR0@7o?3^W C@rqa^ L4lUjUx 2l{ sm#htѥ>`4<7rlboZLcyA"΋U眝:ŃObaǥ&Әr\>l<_td}S~0 |w`aK\CP?w!|D yk։WEj~kǛ_ヌ=?CZ]/vƿE,4m2LX`0t=@w`BMoFr@#x~;lmOd)dwwFl]N{y͸?`wȹ~O5Y4)W CݰU L 4g\<o:uDq%؉hM5}pz4~̴,iIɭhIYQKzǂ@fN?27zoQ"-:XHN5 %X+ʖUG{Y-S&S,$[~*B#zUJ/J[{JRIبCJA6HI y8ϤDM\A`:,V)O o:P#ž!!W`XKy/%i@7ar)͗;FJy)n"K3 [Mޠ< KI:kMjp<,+eJ}13(MQ-{ ЈB;]J S;a1J1uOCĮ51:SF:ZG/n8耮_wɵ~;;oG;f䆩c? a6ʏrcptsnIT.W~Rδ7f FþU0\?20Ɯ>f‰/=ܲ_ۯR4gz/o󶲯[RJǶ`=Vׂ5SVuUw,E"4o._قwZ5|ďc}a58Me֢DԘoNϿk3bI Q8DE, N Hk}{4gַZ^k%뇓+{!gv,/FYsfE;\z]CbucTB#j4J v#:mӪ9Kڛ5nr^˶D LmAPI6T"w"hj6!hsǤ! RePS&MLE eI S")0s`YcLR Ř+xi~hݻwOU$bS!R"jӜ,e)-K9mY1 4B7 -u\cnPF\IjD­liZV !)^ 6_Y)̾0\  ~R|#QL 2_շLVHv EP丹-*OBsZ(v}εc\:V"l:bOQMn3D`wXۺ*BPm#x\^$GwYmΚ[/LLKLTp 1zl&0ˣ*uЉWYi֚("c*Zc33TY`)(nݟY^ancNW,d1g X*oQ-r4ʹvZjmq@fY A =<(ڛ 19i(nҒ4@Ɯ<Sm$[SxL݅e"J<ayqCs`ofRN$wEz"$)6\֭p%]Ca\ղsU[@XP|jCe7jm %pM9Z;euve씕ޜ"NSEP|7eIz(j͔S+iGj[)  H) 0؈4`Zp iMw6 ˍW̅<>i ɫ.R_P'(;͝p b(#A 'O$XCV.U9G+Qui˼e^ש+_k[-ږym˼>\׊^,ZBׇLKU$4R. 
TmoFV.3 -UuIm**cKЊsƆX)93aM Z}`NE=SA)mBjXtZݧ}I'_ViuViuVy8gAB3MJTQfgBOwj@kYdSi2)NFP15k@Xb/bJSVܛ˫eƳ]e]U1]ZSI8EmNɓT =@BÎ)4/IԺPO$oǰ`9A )1,XI%1r 1KEiJV+[ ]X@H*8g-|%*"R;EaVId-돆g}1h 'HS/(Q1"T-,h\.gzJ_b/]e`g`dO[,ْLIIK^^ڎKdU>Uo7w:yJakZ)CRvX);V+e;\)[Oz1Fcj2UKbu3cNf_%nlPusB>ad-@g/'Oa>~r_S-=wѐjm,U(x~b5BQ6 ݮz8b0B=Ctؓp;Ndil ˍ.**K%e"MOpğE|&:9Ͽ)C9?tu* [9sz/{;$e;Yu*vVjkTEt=y_A|D^{xLJff}d]jǡiCԣzזU*0 :eAmq~V_Z("T䪪TA&_T3@Q`CR%PQTD)x tdߒSQfKTE=^\Eg/?|spܟn`k `v/AM}   w'?*?ȋ(;~.{A>%c:UvQvtޤO(Ԉ Am`!T9~G *_xZ/9W癜7}yx57gO0 Y9me\0e 9f@!ƮNQ HW`)) 6 C'!7-U%~@(UMiMiU3$P3m%ծg᪜_]]#^mqy[^QvB0o`#7+ Di`Z5~i\@v{Q=F}Cb ŶGO'g'/O09т6ٲ<>:=;XͰZ!5z,doٴ~A7(i|z!h*ٸ;OϊqT.*6 턚v 8Bzp)h!sVALX6B.ɓ7Cp DnpB{{L~(<.i!6kΆݧ>PEkMm]%!ԣ@2@)Z9v%b*TC*Ri-:kÏk9@I7)Z(.p#,nd6-7#dYG aiܑ`Ӹz2=T ׃!/1kH{17TH83[n LG ͰeR[!'LFFD0Kfa"ȭC`RUBF:eHqoʓKs{娞 .+=J>4MRD,P d"9,ۚ, qu}E{)؝ @mfiӎb\I!{I>[-X!PKMa>.sCAd_)ۥmqo F9ΡH0m쇥ʀ]79(BD7o4p[f[$>d5v.@&!*+D7qy޾AK$x\*~p9  v݌fZّ)Hh/@)?173tdʵmy޾A1M@n s iufT~ml}!| f_bDJ5PP 1J 'i0*o;(4CL@%\j?G)vŵ(|eH4{u=D %'4zݰ \A,˳%nߠR"2M=b_i|\D[k+a2+\ }\6[h3,Cro0 +@փ5ϭ{{mAP6,Y17aBCNN)io 3l-(QXv5!IJqoߠ6 =I1Om rNBAnDe Ydnqԭޣsk寔b*ҪٸcUM[m#3[j@,7kh(5rw;|ECraZW#tCsA78.fΌvKl[vwxtO'xgի󳥑`@-疯_6H߭~q*W?7otփvH]Nw&0e¤+1TDKYxlƸ,}mޯCsO *m,=/gor#Pw²\_!< [;[mVAsY?IwuIy[3{!.R}޽`um9oNN/G,7}vQ,Y^N4_/Gg3j{>3NS\;gPX&TN?jmB}p] VpӬOZ=٘ӝٵ͛?2&~ Du"3!p#gG*sj|N1{cOĶEV?)2'K.]%oo$xk{ݹݎf_oi--=GKXՌ[\rm^xZ6ۛjYvjYvݭvH'],m[WϮ\TC0f+-Pц"(ĐB;$f'_OVՓ|}USяKsN~ :ͫ c{](wtyIAG'FŰ-sY_ph2ȇ EK1\"W/ZK+X+)Ėw 3$NE% =QJV&A ̋ʚ5 m%dӋ\yP ;3o::?̸>Q2 DyYLbvjV$Ք؃ LLUA [ 6 I)eF+ (d29kVQ^ۗlY,H5MB9d?wĚQi4\G%:n%m})4ImKZNMT͕&yJ Մ-6ڍ-ŶєPXJkcu#yyUXќ 8ϼg"YB@UȝΝS\H5Zm_b,-aN">6;sW' hUWWNh,;ݭ.h1ƙ߼p,`\`}l_grŌ 6St7{Sx 2(Z@\:!@ h;Zv7cV􎳤͢R6~J]F(}WWZŖIJDJƑp8F\{xPdC4 H\!6>E:_`N0Ph$lq>К` hnTpPx]!2zB2sB3+iE#LbBs2eq_Mln}X )1(މ`f:e>+"S] <TBVkYaBϵhLJ?Į'n#dq e.EP$x.IFxL"e8fsL4:\@2Lm'`4SIke226D+}Pew/'6 bɕծBzF +  #!ðIAҦ_0e:sF1kL&Uf95ci q4&" D{\s1@P-Mmtw[ ^g0)~CWa؂ D˔Ce|P&\Xd\H8B{j ;V@2f *hp0UbHZpv.m xAGFeǰW*+s+Ls 32qe+y7% PlCM!зI2KMVr`M:es^@FMr=^'&ܰV#똒X4_0TY<0uS9f[LBUtibZUه~Yrl6Ks8@WOZWiU0Sk!3jey 
yz5nv:bX- * mk19o~P[ Wԕh]aMR BZg9VF ªZg &QŤRnko|֛Xm\eNb.kv q9It]$9. u䱛PƸWJz{ *U V~kZzŕMpJBυKĚgFYWul*2EEX+dXZ 0cQUYO)vSSZG-mv8N沥l"6*b zՠIi k7to&6*ʼn,%fXdmlڭ}̲ p2񪃄)p]uؤU9j`d 1Cjd\ZƍRoϥ: St:u J5XN,RrQ|q[9 ;aV g} ?Za흺Ŕ{g;cҀ!=VLZL+;gr4EA2lr?z*<⪕ຒ7u19l}M| ޝ٢JCBG1l:]VRcY' a&$FlfO"B2/Uc%Gy~ЅTc_ 8 d%Q2w3(Hݦ 5uA\PstVn0¼u?` 6 m~8k (0;\)=&?db)LM"8R O2yC9 oڼ((_Hy5QgeRzfal=8`LDpJ.U\яTvCqG sО^5s~ZiAi}%KY}R/cR(ib4Ǐ X *{ K3^pdk&$c#@0xd\>A_x@xhVXk:q/<%UxF.54?bo3{-f`| -,UT٠ɸ]/ A:v,=m(lU J,0~P4ӆ3&mmryʹU|k@o(g>Wp/A}JD|?X=OtN7+rl35Z90w5*ja~c&K pi~g~q?֔##Ooaz T~@>TJxztPy9Mr;4O[Ҝ\ MVBog>~X% N4XxPUd_.$tlpyC6㚜on*ʼn+&Rw]?g+J,s)}6mT-2u9x8 WC=μ{n>H+KAyU 5_k@R^9^rS_TǚfbZ ^mVԓ&rơoWt~SrORE0֒k`}KDְ!g7-껍:;汼-gܴF>^'fYp7/MX:0cZr\!gʚK1ATX]"Ac6X/nǽ4pO= ȝ'E`s LKb[b,P΁\|`~>)Mf |qIL&s3?epk׳1\c<}/i^>|X <ķ۰ W7$bLޭ?YUICe:tng=*Jo{V^2vk"U١x<{1 1B }u;*XWw;,ھk֋$T5o74|rB$^Txݨ/y_k,e< VWv)IŋI}g#0(+p<=@iWhv u.qewAEJ [AM Yl=}s_?OqrZJaqrbrsoLW~ T1Uߴ3V^ %m02#{N/'|l㦰6rB&>σvhʫ4 4s|:CTnwwtHh]s^z^H"bۛs8axddUɀ3#qn &j% 4K45?)O8}xwͅr>ԉ`kʮa{07F/bK/7WS#t h1 ah\[`\ZqK\&qv./΋6#+KԺC0iG=)kA|MI(FY"XJ22yQXo"EEW!3B5$\';JZ{3gȑ94mzm=;@9! /u#E%m6U)1$RqN 2b\ (%͵!NKo fa˾5LaȨ~uީIn\)=U̷g bk5(Y!@X@0…m^b@1mQjM'|U{e*^&͍3%<:^ysJHW:DИ} L-!'\dPOfؾLZ{S;}פ2BeB+! 9b3^8nF;ɼ4'u>cLn3O#8Z/TLPgbCBlpUY"*tZx-0YTahEPb"BQnfVM,a 鈭(0'= ? 
b2`c \<͉L-la箰`fLyp*فF4(eؼ>D8I`>`ߛ%@Pr$(7`G2y*3jA 9&:Q<7ei/%n^jA!؆w`z߉i ˇc@BVx }GB4(P~E 4,h*A7Q5Tmlf\>mVb8o`c n'Z5naUoi 1H=uw8, Ƌzs9쾒!ݨn$M`MAXF{|C$>5{0Bq kn\)Asc@RiYȾTǽ:UM]X'Ȟ&{N6 {C7s" b|=pKĸ mZfԈFz R%IkoSYη*v,\Inw'&7yylczMCN#NDA>u(MEt.@-DJj荾ͦ.n@1;VcU;õ-/0╿ǶlqtCW1nqnXJI Oߵiq;Qȗzh7xy&Jax<ȂRNo'}/a7"= 'ףϹN?H#E=''3/ xmьcoVqN-FPY0yY=euYgc6WIB{Px<ඛcFmȖeaQ)`Ȍ&.g@beLx&[“3:šOƱac!?e:{Zqоqb"PfP$"0Y n (!h֙L{HrC|7RDF0}59=%hޕxxgom#H6Y9g;)Wy.۹UR /#iGw@@H -M].k˷d<+D(vj' Ydhb9ŠԞe<{ځ@tB=t#;2Mah=XFmy|HؖD$11ZWxcnb\UD򚛀V5]BD%&%*UB$l$mgi "[O@bx>4RgueAhPλ *IMٜT2њ&6x'5E&'4G|09.R󢜼C?r)P\t+NsaP!umK6p- #za$9VaCAT!iGfTU-ٰ%JiD^hjO(_&A .C<5#Ϭ̦), ) B9 fN`cvUZXE۰AsnV[]C[ Z9PFM׽ 5z{73f5jdH֜يX6)8AZR~:iBwLIsN~MeG!hkM~MTЏ 5&D=zݢ5(|HQuwT\l;5QP9CNuK׾sSyTnóՅPhߓ WʳS}Ld2z>SdqZ^˟&SLjj&R֫n+`i[ &󸭕m`uBup1 :}_2Ɓ+e|GHCoX/&s:*U`j'06֗sQHwUVcoԈDb$HpEp8.1K 9L$*f݄wc-ET]Z+m[ǹhۖSXǫp!?Ţ,H*ADSb,^_j XkRCXnI-9U6,Z .wt`ܳ>_SF[/bްxz'Ix͡f^d:gr>VKUɢn+8vuE*uy,@(A]~#qiO$%-^~w/FCP|gtA6ZG>f%5h'DakXYtZԺmoqQR Jug6i:s̃c&5ɐve 0y /*@ϊ'@cнKPFSRW o!PrNqڽ ۷<+PLu`WQH<4)]VyTSmP|hYċ]ZjEl=&)ELݓ䢘 XuUbթK"vL)˱LeBQs 9Z,=)Qa%^yDW/*|᭲^#x*&8\4%Oa$Ov?3C?`]fSj=fˋO;l, dÏjFyo0xzx\&b4z"'!9IpTȇw7]40z7W$VޮUtkj.rl{i `2tI6<]7n{PD,iy!~~bZΘ&I͕/5PtԾsa=J-;K.qx*t)vU?f _߅dgwio-^7M.hM./a$<Q?NHb]7N700dVS^?2deU8ȹ^?S\2 (7,ނ]G6=k9w/YSzFl@kAwvCj!k:rV.=l&;/8}g:B35& snxW$ݼEwW؟:rMdO*܀q|J=RC"UI~X7K.w{SvhF^={qo/8uo0(ςVm:x|~B2AI\&s>K?A~?գ^fΜ\P(o\to~4ݰ?viogѳ1/G' vd}g\~ma8fX!B(q3-B צ~tTVy%]^Y| mNz|Cĉ7Q?dO.ǔWM˃B|r ^z{ p Nڟ8E1[H3@|8hsd3O ~ob" 9\A"~Td97&W/N\6mܡLR)rH;r6 $l a H+H?i H@[ l櫠a 5#A!bB壦O"*j b@"6?ǖ5 D" T ; Py& TnϤ[B:f |\X4TķbVYB!Bxc 2u'M$`I%81+i m$֡Gd0bKʵq9# \V{ o+ivzH5hhx yXJ$f&HH4XR*5 YeԸZOŃDsY~ʑ$QUyf9و]=݋>^G'-hGє3K/u鍳LшiXݡH`F"o~O1eۊjLXX5egw!R*c$B "ZpcWت@!0`SĊ*yS9{WtJ`ъh{ѫ9ڊTgc{;ʣ1>kUhХ:$C!xApbf;kUG!pJI#y`Ra$3s)a&ܐ:,_򆘓(ovc `+7 JU*AsdBtKh bQFJ)k*Q{@j9R:Dy?fŠ`0UiG*xI eer$Co: @r=ȯٻ<˘|OPLuoJnh"4cΉl3\ ${PZO$圯V9JuT3A][ ޮ05j GVHins xzI g>XxX]kqu$]FwK' d.aqHh:9x:LFcL@pWgO~ou3č<nr6fugS{_ldRyp^b>n'9f^%R,`.!l[6%ħp '=ψ}2XP8%H ,ɢ;LUnº-tP bEvUyr&GkY k8Az7~+4&#^L cCB'VmQGvwR*n 
3ӷVeꟽxQ9x8gg.z=OK E"dg~(KJ9By m8FR>of m~^Az블o=m;[܀x%#d#"to$Ù>86 jtb郴iJoyvYZqwgioBJBfW"/5;*LM$KkP.{SJ.5=hEȊWz_AqpDmv]jY#a(\QOFE=zZVr`cw<P\Hh>R@ljR$R6;U YBA Z"V.ѥd+EJAMjϳwY.orZ}+Tr*6hIlu3 Gh!4q3[v 64Cڋ0Eٿk|1k b``0B"(6cM̶!w9͔,b Ag_Jrx&H `p1Y㫹͇,7Ĩ^D645#W?g_ p´ &[M{0t1ȫOI;}o/wnHtwÙLD5F H7LJH!>Bj8Ag l/Q:0K#FrV! U?1Jʥ/*\%C7uLȺ%2nnX`(?[z݁ZICɪ4\7\BThr MSyФ' J :of4EJkOuԼAKd\Al=Y"=Y})`$P&ZŒUF6_3D)Ct2Dv73DMa(B@=(1 j JkmԊT_^&'9u5P4\"a.}BhqXՍ6@<)/6x p$ںːaܷ6f W7ẇ7^FAK HzÆ/ىX1rNjf)KWTȢfRPn`SSvKSCi#LEI1۶T1'DH28:q #]sjezB @)E[bm۲E2%8~e OmS36.z>>С9ඞd@V;pʉ$#iRicۗCTuEh_.SXV&zz`!5m .P*S_ZȔ$y2%Ʉ2\ZfQU1alIɶ)R u__ԗ%=6,$e Al ~ ,-^m p|sfaʂyir%`r_o`W'/E'K 4KA؇وcӣپ}J4I~[a]}X#0KBRȌn|)zK׉y|uZ!~+bJSFi)IPf-X|1Aٍ;]},wasԔu,Cp,T} KQ:aw%Wf6BԆ=QѱY0 eMZw \hfZI浅E~]$ a4CT'1YʬVãg);A(Ng[ z!xtvy`/DsD%WxtqFt}#cv|gocvK~2_6$s }ۨ1`I08Ra4y}'0SW@_ZwEʝbJ#JIe&tpqr[u #cl2cnZZvZrC4=0s)WR#a2ԕ0'ƕzy='Q#RJc]=y3F65Oy}[zA߾?p"sb [v$̅JZ`S_ۓۏOI}q7!==d./ë.={Ut&'hMF}7\+D'+AR$Y6WqHå"XZyEfT !0 t|x ?7W `ֹտ*ԣci5ovsWUþK{^$Ybe‘*"YMhAW$ITU CJfcVia` C,PJNp&2=I$)֛(y䨘%ryQF+X Q {&V[ 8΢$VR "<3/1KpF2(&E ѺHAH?ɑtemk'RFH1D#A{)Z]c"*eO <1aU90CeMK) hdJoY'Վ Bn 1m,mbNp(i&L׽i w^'F> @,ޟ7&"oͧ^ 'KC8;KL+{wakWIN@Phԉ*\-9]B}Qtz:d{'F%aulܰLW8W WWf҃#:.FXvk՘\& f.(PHa [ jĥhA Hs*#Ue="vXIm2;,S&ǚN5*C ,Έ?qe~::Lqeϟ什 hƥegp)/d +4B&$Pa`2;ɘ -"Ƙ~ZĄ@ET:[%,͢](G oƕv3&gLP) I%&Dbl]\Lyb0vdl 02'4RrAiI0m"B?^޶/,pfgA?7RVVae1# #! 
i&QY%v`D^+ GYXw#Ƅl-;I!2dZ}݋Z 'KW'/./E3.6^r{b4~8+k.tbRs AwX83" Tqw;qʰ}t|섨LI4/tioؕ9jk0wZ;eMfYQ3nzs~XnLvcAy;B\mir|pt)hDKM,q{S@imu [Q;AsW2jEO=aRt/ח' VBzdpgQ ,mѕe(y!'vЃ,8`:'PB^Vzi񂢮b#/>J#Qlɥy[kpiIL8@XD[ /2U$.!+^ 908Su380ڴVc]Vڰ VİA{<-oR*q Q:%mFC.`Hp)قK4t;v6kef :~umP\myE73WV)IPZ*OPq G141rhǓv5sӾ`1y~Qi|qA_?CO?{Ǎ/0cHVSV ,`XixbaRϥ%r'oR$ 3fjrPH 퉒ˊDŽh5A;=.9bH+/3NBh\ *qB *mO ,ݽ w!ҥBd[ɱ4oGJ$`\(PisejXrWfZl vvrT"$I8Ir-YTPz3TNIda eD =VWP #{m4>~S9Rm$- =+bk&3 B37((Vk~Y@VtKrG y-t~r钒ΟNXoOoi9xܩq,L]&YQ< _ڸិM0^/6w}ֻco-iZxn~6_WMZ Jtq5_p:yagWV% ?/c`˩]-lyj zD@9YY-:^e4u=\\`Qe#oK z*oؒQSWMYKs)Yz:*t25&<$j=)佯O˫7ުqt-!CF(>߯oxzY`5Ѽߜ+=Zxn_ wTHjy ,:R$.fǠnʹᮯ<}V(QMNQ[Z =Нe%u-yBMas[t`-RjK3;c\o| Y_Њ'2n>LGHn|֟l_ϵ_ln`O[\U?UbӝWZzɊ.q]u,^]WCuu=t]ھO,~9\"ǯC>_曋Ǣ?n7y돟ܗtMu @SqlM-)a4r"l459ti*.7#'J-+R$(MD'tV@!l;\@(`Mf8[Z WLt%Ӄx܌0&5|j,{ NCEovb 뺆N [UɇiAѪ!hRU+͠;b71TW#  ǫb'&q*bg{Eq1v7к8Yˎf~GyQ~R4XֵhTӼŋi8M$JPH.{o#Q I$bwwF1]CH&?_`[r!w$q9e?!ЫCY.EƴՈ{nٻ6sNi !t7QiߐY×1֊K\6o2SVɱaۢW=[]I2S/c =^Zs^Fͪ(4J{t[- Ѣ-kOɊQ"<*`la%N|. JȎc}*ï0i̙bbJ✥y`I(Jq޹zWL &yv_N@MN $2yqi1 .F(oQ0bN4yn ft%+^R\'JdANR]?M޾-Jͯm\7;wK<==8J%Q%sHP Gd_Q iQxːu[2֍xpN:2= xZ-b2 D֣\Hg}dP{w|qM^~PI ;i#S`,i%& IC;@e(3(i@%x"ᳺWe֭k8o-bQA$mg5*gyj%m"ÐV~kb DyBwD50zApsP> CfL4nx2|4[;ֽԠV8Kg'..S8% p=h: l q93k&:8ke܋ ^`MFÐ0-=PH*s :RXgUIePʄ,܂o-4˫ņ F 8J:):>Ckt\(a#/uqQ`a &Mn1!oVX+JY&$drda0UU+ld *m 6IȁZkvH:SVi3q(S,継aCL3 ">96Z)1WEY?38c97nLӃx܌eZpJ/Ri8ʺtaH4Ojm;M=f5c(Dk5-:k^"(~ca`XGy]zܾwKD[2EjL\1"9?2M!.M #tP]@f<%?UqTh7 R[kib}M{vf@̃μ2G:OKJ],ODP`H`ɌI&!d8]Ð7a^ C[ۻLrayԆp<+8Vi'tKfŠKC#!xpES\r5j?\ ows+m@Pa\/o~)oe=H?N~/~GSKxˎKkBEFX4ruoG?$z{Wzpw^f*{X2֒ؗ?0Be-I-/fB$0( d J)sZMdx#͊\6|-4k6MAȤI>=ps7Ct湕B}3_龮辮辮辮]? 3WJW;L&M$IwTIu+yRdg~Lۏ] 7YEBmC = BDҾs R8oTVpLS EVe&D$kĈ F:+9\o (PR.[(jΎg=vDW 1{(dNjvOP?sQ]';B)`t>巂Ct³eyԀ+6ĊOx^ޣ/> 2hUSlS#hL~cC<']݇g3Yi$#GioDS8D 9oNҵӻF kT>=b RlAjT>хFe$O :X"DqJoa|^ZMt|);1xoWm ,-ۏCnÝK\e 4خŁ{4>n[*Iuw7E~SWJBR|Kʖ`%)w][or+^wSO$', ^mʒV)4-MOUXdեX }1(ybc4![hUmg+%_pA ]MRgVzuOPrrb=U7mZtꛏYcsG2q-I6t+zU>l!9RJ !H[6Tũx$;ׇWǪt5% *Г҅(:v<|.sϏ1T*dKTJs^ -Q`۬\/_+"%ts~@9 ܏w{L:!  
Fx]K1 ] ҫwS+|K^>j:vl!yuWcK?ǔ ~6^XeWq 0slMcثhNu9`:sy7:ׇq aȑ!?隭+r"8.ˉ!R/p2/t?tػ[SLoDùctUX%qgY"[E>Kd:ʃёƝR)JŨ@ 4RYYBw U3B\\G TqT(p#1j8\*d#cRHGbtOvI##+@ފFȃDc)QIg@4Dֻ\6 `CEM+4@9d5D=@~Xx VJ=xΪ@.[犰{^5+/4޹A֖PbTU 4z!r!⚲aMճ\G?!D06'$|sUZ1j5Rp]Kuv0y149!ݯoHOLʤ&H1Lq2* U0%ɨ d댞S/zFn^ݴ L3DI-)NJS,Iq2A&tkwӽƦkfF<"^W{Zi-6lxjUGw6ZZ ֨v֨ Aa܌+PAVPCa/rRv w2\ϦlSg׸"/V<9?/}8KT{KcQW#>O 2kW_qcͧA-~VS57q7Ꮟ 3҉xwJvzmtʛGzmlk:INQk$h#)cEl|ؿ{\95+3fH1H6YV_=Zd7DTN{ (>Fx03{n尿FznPPaV:ak_%)hf,f 0!B@,D~BB{ \ t"Dcﺨ_^_>4B ?o„0L%LI\G&Qg I3gzG8[ Q J@5 jZx(&Fg4:``izy^^{%2VI4@#+hWvt9BS 쨜N>] ̥/(Wl\~w6/7\VLe9VH-+,W1&ki*)4Hѐ2cV;A%tҏF9Qc~9??)$g$S &}QPe?%N2׃#! l3T2Yܻ ? n-c-jN&ϗ i_ Ճ/˂)aف^@D]BkSŀriȗK;o62y!<↌[]r˗d%X%*$&$32[p;Q,$e&2*H2hA&+\~힬Zbr(H]1ؕ]m ۽&[o ;2ɗaEaDK Q{d#(ҥ-_̊-_V[0bfM45P:o3}G=K>d#R'\}_GdQ8pkA=.4iYNZ):I.oWVI5%J:@@/mq"Y@u5<[Ƚxp8Y:p@n[FɃW3Zi>IM`즆fZJ5Ĺ'@5U5 O,rlcWQ c7`=r6,dD' JN<Gf#)'}ZS"(Z?4PR- ?ƈmqSm޲п!ggx E9+Y|ΊsWt` KJRFP@Q j#<543r(EJU7j[5!&W1\QO<{lLCMR-8r"ղl'[Z=\j9'cTr*gNXzfBpԌat2|_%Ae^RZƒH1e 1'IyM(G5`?՛vǮcriW~VT~h ]IA;B83xo+*;xIcsu ! ׎'ky* i3 Cڇw.2%XT yS,E# "Eb!U4OvlGH#*qrFk!U@g \*|@Aj}e$-j7+ej}lDq-TB3^ 0LTtywrj!vQ m -;-yÖ8Q(et%V]pdluB?v:~=TN^3A>V]λ=abҹB |Z#oO KS܍Po4ȔTt)h $RjAH3*T mB&$YUeE;!D'A`t 8x)!:гᠧѨ]ITqG,R:#np*FS1B!&h+sbH E;u$n5L34Uíʥf)&4"{+8K!`k =\ oݖ{B8eWϻ+z[lT*IGu>)hT1'%|@.+=Ow"jIa8n"^KN? 
}x * {ŠJ(JpeT>'!tj"2KDx%&H)I Ύ J&Ew6b"*gbECD+#2{AidZ.ǽR[5,x8YS4md\C+!|p&MȁFhbjN5 yπP'pL\Ry&<#b"aV*DFdٵ{A )'UQ+@L@%jCp4gh04Q%b]NHHADpHT~(x'VFPP93&`k0.r>~N5ARҵɶ8#]zhG;B@01v^+kE~rFO.zx[~b] O}9!dUGqYRpM-2pr~N( Hu͢~9/:=*03&8RgZܸ"̗c2ػ8b;$mO<4c'W4F~Hc%NlZMVbtEc ŹٶǭMB 1!nF7wۻٛOW(H_h0iG1>1w]>91n}usp7or,w'qq9{`@]pcs|bܩCcGSт3 Ho(wE2"s4kX@"p$7Bt\}S S BR~fOJ>G»u\Hm Kc_skL,e#O)hu(n.3 ZJ|)Ce+-:OIβN$ f`Rj9x׾hq˶35Kpnn>%i;b:GJB z(X,`n.]HʰJw&GCD8 וBP>yw&-0Q4 -c| hG8u1tZȞNc$Au4ILJ;8')dZEv ;2tvֺD0NILחa Ob5!ES,S*eCш tSJo+R *]Z"U%B]qa͘ dvE-K:hRr, s \Re*ʙ+"TPԸ,zoR@e (et)&DZUVƦ֔gjs]酭qU?׈_l,^;i2DSOS +~c@F/R)ϙf*|p8҃@!?brq]M+O'hGa qjV|x=]oVJePAٷy:[֮>~al8K/{ksrs3+=kM'+w:o?yt NZFo*gArq9ff3Op/"^ -c<>wqvLnً pN{r|0#@7'lFzq!A(3fC'm)4`]y E(t0/J'KY:TdZAkCŬ:- 9 =GF@؀izTjIQw[th'9d5>4߳fȯ pw)PفU~+-59ӻDsjYCqX 2,)oڴUzg-'wV=SNgV0bؽH2I!nEhEp'm&TNZE@yЂ*%hJ: TD-e5%\ISI\3#ޟ+Ps(Rn-"Uadбvu%He ]\)#';Ϻ\B`H_'#4:>+g%ۦ4A`vrЭ3[{zc6Dhbɘ9Ye'k9mAhosP!Xe|jlvt|R(#w,$[nf6ޓKD)$JJj_G !Tr+RKa E%v0%u=0hÅj8/d_JrTn:ܚν'kW<~n20pν'CQ1Mqs$P;ۢQ $AwX5k^1 Z0 Y3I%\q%\q=dT7Z+RAS*UTU{U)K/F|MɟG&yAr!L@Gf>#AI,棡09} ]h.y~VAyQ.I9'}9S qZҚvx];;;;zl]RhhY mPVKJ&\֔A[+ɺ0w,];uh!d 7nc$a$Wx50Wx50_ 7MyebqJ I]!)Ɣt$]3n`6M6)9IMq*Cq4pKeJ@*'EY*d%֛4dy,56XuwRYebJq&J (`:Q6Xo"2Δ?14q d00V8ޖ]dzږƂ?i/{,w >ߘ?~zx6=~{YE;_.bKЫ[|0U^ԋ?_ ud:[nY X.FNg[v0K͒a?]_qPnۣ7s2BhsmJ7T 0%HQȰ#R=0nI-32[z7޸qƍ޸eƥ5D)vkB@4T!X6juF}ɖY@YYk ցv2Ǯ=QU(H%4lQ4c)JgUѶp&hV xjۻ26cLI +pP`O +_*W /#' .=o^†_}^| g-^>"!lClqӽy!ȕVU6TreXdnGh mὝ}OK ,tmzNdDdy5΃hB ] ћС) A%`*ڀ*1ebciuĚb91%QZdrv@љz45ohљ/Quh!zԦ:D%wht;@u"4ym_5bԵP1j\F>8|Ǔla.~~皮yeSCs4wMSZIn{6PsAi´%I4FS{!LXB!CѲێ-ێӽh붣 UK ni=CY-u 1tm˒dCtOt2{ߞ7OcD/{)EmG)^EZkC3Ze.vl2L㿮F:iGvfm3*;l'PP T<ζ3)* 4#s ۮՐ lrT{2!LTCkiFFuN9q<蔯x%!Xf/&=![8hG}$MeV'!!,?DYwc,Q2)sj 7;e.#ey5ޯS@l.j7ǏuNYLfުNcHկԐ9-ؘ>&9Sr({sn]ms!H]_, 6fp)!HqyEd[9ԖYFWmm-ngE\b,'M'Z![{7uD<2^P&'+dG)(oή"Mo`,8UZ4b6+~N/>^_ف.ڄaQA z& N-հ*uHuvIx?"eD\ QîBT Hj;HB4) vfշ;ޗ4ܗ Nk=uĸ޳#Wٛۓ"Yd1@>,`[`2eF?؉~%qWd-l)GAbכUbU$wXƐRXMMr8* -mTU"nُS plO] jl9/;-wJ-}>. 9y}rcfҼEC30ΐS*vIayWwc`|?F~c ߰욂& jr4MJjc\yL§%&ټLI)KTAL>+RmS+aFA$:M%) 6 4)TT ! 
s㣭Cw^۸/E=R]Uߡ!rӽ3nE(&Ǥs]L,?;/oE"4f ']MzTp;Oֹ {kzah)ztm+&ȑthcK=O~X%\ kfVs ?M+-^!Zh;I#*;_}bqp-c5+M5pʩ3ڦ O<5"KvG;" +ʓc^w1EgcAbRC` YWQs`Qw&4 ^IҟHbhi޶C(lFxxQI{.y) :JgUR[HJpVX"fdUen 9ԏ"#YeMN0MEjHuqiڱ !S\ Bluc>dd˙p7'!o'$I+_xRGCiLXS]Yz &-s5 t=w;u|'rZ6e`fDsVm+:V.jn1rDIG75Se6=R@2 Txsz0O`-J'{w=q*D-؍|4B?w'?k^6߀Tщ@s|QfibT2jHx18KMVⷋ*m1żnԢ3aT5PbW%ڃf*a:Ifm+o-}4{EGh7. ڊ+RbupƊJ9mirT3@:>9n D :"A̘ _mԁ m` G!p֚^D2L_#ZZ @ vJP_mE&Jq)hSpRH'K}.>ͼΌ*SS-4M@;*}7t;v -YAyW-}yW8h6-M12i*e]̺fz[2*rG@ !=k[xZq~Y /ߦͫTBɂj0eӂ-s)X40]-Y1EQF)}b;SźxEiʎwG~YEmNQR3je:v:y e'^Ex JvD){/Ax;Υ}VVJw/JD :,EatO֎S#@&ҩ\1MQ4\!Dh*6,dWlG1@d3`SD)w*Ha{O3t q魄mjh/Vhw_ EI=Pz<:)djM)CS%7mBr%s@n]~}^ۣ+u<ǎQc@Å&(eV-`_"~dq4QB)kQCH"=Lnә')ȹAӆ߇#i~$F(s(f ]J&q) Z_{n0 ZT1k˒,AV-=T}+5}-&K~^ 2i _0?NC]i: uY庬 4q$eSYiyfS`<:ᓜr2!d iPfR?>-rZvhϤ^O>,[CN' /r'_z闳ŧY˰3Fa/u rv;1mbx,0UiJCRmK3<;PFWz;o2aIsC: DjTv%dưR3LY4ZMH@ yynYYR rR]׹ hezd(XcP(T> N) 214NLD `-˾Z+#^Vluwdbs]^LBqû8HNB$'aƁAZAFѱ;budz4wi3M| _7hCڣ!.3n]J9f񑚤}jS2"Ov viiдӠiAN˚V>*"D*ȼ;Y&PZ,P$SJ%),§6ѧ]M_贫m+Y<\Ge1H!9iJЭc$UuqE+cEA9{Y%RցA)(, -RX]J P#j:{GDZ:%MQtt,SeC?e ^q|y@{Yh*@L$fZ L =˻KCiD?Ō~w,FP3pȦiAY,E \UVMJ^k+ e%靃k\nkגYigM0&Y\{B +f #`@QBزZF~Gj?lHI1Aēl@4*j?jCyF,, VϺO4A>0XOItXuhզHa< RkeNJ1E#bD6^N`a-E?`lyWR:H=n,8v*2$v 60IU#, Ԡ[\f 6 oYA kw IP)KeMѝ/$QRg BrLrd ?< vxm^Cy#AdG8ܚ e,B*&g[(9ufF0iӬ!Is ZK[\*-P[#\J!S"ͤ/HRB)pl؄tp7* fU{oKcxU*J-i` 3Eawxv P(^Q1F{}h0zk=%f^J'm=DE/L >˞-vk%jH_!3C9P-o+Irl!cՑ7HuUX8*umM݊5 Sjh n$EiZFOۑI`vJAw-vş;+5Ŏ7@mM!Y`٫#v AV:Y[[3N/w.0ri9%~ {*Π%~] onu`%0U"5FR]'oբG)~)~8w$@/,(d 9gWul/ߜԃ>z.l.}keRgٲGqPT\%{&daƇl*oD8ԇ|ӓ:ĐzmC6IC;҄;Z1ΆH| (>vu9_|PK@(HUhnŚfwu][]*nlWSx <CoX; O~d9L:Х+K֡~/sXVW^ÛwS+[p٢$Pfs"=doo<˷[~1bqի)]Y&kOVLs= Yy|bRL/"]fCHW_偧wQ4,q$RjaZA}FH#:L;@Wyˋ/"cY& UѶ(nm%M tFOlXnbK1&1V)՛8xq;1f »H~7'F绍PLpOn]}/ށ`2U~]?\lşjUYS<폜9gG^N}c t*@.ușa8xiS֕ԕTC`/H ޏoO5|ԼKSլ^܈lq _<$NMyo<[+ oh5/lvM8a?rg%[w uX,Mm& \mvH:<:(Htl=kqD8ܘ.ԧڲqvٻFncWXzI%pJOu*xk>~K5`u;$k'uiPH17j(i4h|ѷlˌeLr cwlR ry|mېXCvob{-Q!h?.g7ncT߭6+#V&W̺ 3+~zÖidYr5j~]P6U]^˶6nn3b(dogJ{CuXvR`q箷)-hqװݞ]Zu-WKݖ2GĄ27Q{O. 
ш~ޥW2:sGǥ?:.q鏎Qo?gDf0*&3d40a]ne95SXqKa'`t8afwW_07yб't0i`tM.&?zr𭿱nC٭kIv osT weS~<=zj~ Sa1?3۳k:O?—rqlN_XA1zc8Xe+e"xo>8[烗 ZnǍ0&%7'0j}ܼڑߦBqHvI,(]{p'-{"$d0ƺ&Y;5ɂZ, rl谕.J4U%B=UV D ✽X6\^\8p #Q(S F?_zpqdH]~mwՎu2 )-t&VFL9sMV"g93qV?/Q"n3,%LN;Nje?c'Ø8}B8s[VҘB9((iTQ ҹ VLKHCOhaAA:&+9MG6"pUBlVRL4z*~LFޭ8;S$At"Z1~?Ib`#`L,0ţ$ׁń$krϸvl [5媿V[vNlH~`O*ڝ?~4BqZѣ'5^O=zq)UpOC cr؈A(uP[۵;-ѕYr ;V>̠|9/S _ưGs"8*A{8(+!vKtOZtD2~c1JI *)SMMS(E2Krjvԇ:K+vgz:\ Mͥz<ڡA<rEwjru NC]G@Dt Nem {_AV76p*GUy[ ՓؿA ѵ}@ŷ3uh<7d Qm$:Wh<>YCma8M/oH;M?|\rDzm5xZB2lbV5e^EcѤ3?;]\Ϭ[;5nt)|5\o ~ko 7nʙY)Y(*2Z)[f!2 %r |>׍'}S߷#ߊt>T?0@+_>_5Ve9d:[~k4ih>e-%O_Ve]cp?.?ŸzpS{#g'g_̜>u$aPⵞ|Z|n;+-C4y8y}JT&$rO~ϧVWi!pqŐf,H!INRɕCimf},)#rk94siND`R83@fS!WZ]꩚Xvipcnyfwp,jZ==*s堆%"K.HI@R{n_8l3)zBPsA͂!<- t^Wl~J-oVp}|=hnH lXO1v 8g1rR)?ǽ/UP1Bk40-?j UdԁX%9=3QlwU0"Rdi[ '~`?j_d;XfY[(`K `g2Q[OMHȱ_ꙑ&8֓ +Ξ$TmRD(jb`0EH?qdA%RWpR00S'ɩ&Ig^8 ! >8aC"CŇRڿqq=cO&׋_oQՏ~뮯 O{^&m"5 OoysukF 7Dq\Fơ$ӊg>_.hrڛ3eNz:~VcB=5D)$f.gV)^jBj2NpEk[x cĵ1^)0=XZmm.+A4A$@Oo)YyGz[:h<<^Ч7*n`Lyo;U4tGv1j72SVN1';6 ?&֡)ifJ!U9 )%aqN*Aq3(\Pn2F~GAj_LƵ)2;4Vcΐ #gH35<My+ۓoO)RFO"BF)!Hk>UHKdq"VqKJ? 
5 ,+)xYn?zTwKҹWW*YW4G0|tΥ 5Ipqs%x%}|Qt{b,tF0} [&JR`KⲉY-VLIeztH&].P1tvp(lsRJ|SO~R+S8¿<0g*984>{X?`W^.0:%X"P񹒒Z$ /Lw3ׂ_| PLlanIh ؉*UZ@~ >&~z, x% BH؛X 40擢S%R,/bBRRLXt{b[G]/`5 r$ >\YÔMiOc0kCU0nՃ8ď8Cے8ȲOd<5*rT`rXDkRT0}$:ݫZ)8D%ӭ9{8{ ֜ɶ=ˑP䫮-x6jUۻeeNf٥LNb4j^O8%r mB1qɪ$ʳo#[#\3&0, D[ ,BD5.(JB[Wv){N.e K]uo/)h^bWuRQؼtAR%)n% Vuowoҽ`$ViO@ډ9~7ʽ'zDJN <[ xxf7y7_mUHыY&n[X_~(]KHu8/L7:[{WO|R~O^xG@U0:>gݞ_=m:GRJF%ľbHVHX1VZN=DCJFy-\&d~LvOs(;ulN"2llcWoh\)`|(>+d zWǟDa+->Ϳ]!@71bzEZ{*R= w;VAFTdQ+:,vTuGs qrpwΰM S|La6f|l)¨z&#g:ۭPH=7Do^M(8p Ka @G+ hز@%=M{œ+=<:pd$$ӽdAB|{;9Z=O%HӕWs놼/}^gE]Pu-.TisVzcS-84#GS ^#uC_X>TZ P:mzaJ5S鑆fx:7%7ΌLBQт(M Bf6sQ1U(0ϰtc$ǔ%l3Nld+֊t+:[Adj/ P2iq'sG-Ա"7jxN GȭEt'Ж 9KP LZbxHjEK4ThK荡T** ٙ]euaxHdP(J1AL[?h͔i- To&,,U @#.kJz*Qɕ[XYhIBqM)LgY8+ܓsy?T)I7׮8N>}H r= YfY@v~ny`cM 1Xj0yM!Oh練o9k!P!_9Fwh$cCC`7[Ps:YRs߾zu@% mz^W]F(M7r_fpWru; $%O1'.>M}d|{;t:5wcQ emsQvwaRC"$Q[]Fٍ:0^F1Q$XY'C C)U GцyZv7t~ ):6n7Gp-v?O>4徏*q ft{[;c@x׏VgI"S􆼱yd:z}EUՑg\Qzy!Y;2H{Sa.*ߕy{Re=݁WnޒǑFlӴ|@߿}As4=+f!CDzy F@SO!_9FG_nT1z -!Fv T{}KY ҭ Y4F-pewdb^9נZވAȡ]8i$Lo׳68+gy NDZ0:!9f{/c;M6ZAs6yrkLx֢1)(cٯ 9`9\3/ƲGm-b ggL#ogUEW6#}Ҡ@V<ݙPL6D#4ٰƢMo_oH٘HҶmRiv?Ar8ݑYQd΅P r0jXu&2BLfJZM $hrPV;R'L*hl$b Jm=t`:D*e ZU- ~h&BI"űʥQ.QqG(t6it2A|8Fw.(aؗ8sw~>&SGذɩAe;wf4l9 ݉SMN5Vb%ۓJ;;jM- m`7|Ʉ SEFΜYbAtW9_I@IY rSQ o_6U_F:)yqV${FX ׊MR!Ž%)O-Sij~۾/0r"^W1uN; ?Y~3֗6M};\CF+A J S@HkisV0IJu,7U;1LpMA'?C@PgXj1i9YdFLQ! ȲbRm #A:K?FOSVZD%9N)` r>͌ 7nji41A8BLcB Ìd VT @g8lLQݸ ,"@#Qo'r+EJgZ-8X)/PJ( ,%1jؘzkPH.Fp}qC(ξO[?>u)C>v?A4R#.ExG<Kz``7χi!5fG4=TW>smg.70Vj#,>|Gjΐu4&PQsyDcCjLju*qTb| ,M'(7K iln%vpoAhvwq™"pCn>9eKdahIҀyaE8|bexWuX vh|rj@q!ӟ8@N@gC3m4sO( ,iN媻R|(@(ߛQRYB5j=,B0/>K&QӾk֊p1to0d# mˁˁˁˁϗ~RJq*%V*R,s b,'ڸX2RjkS:8.vlWC2S7ü )^{$TXeSN%cKzSHTQDQ7WLRlU+2צKE͙I=ݘt4¡ $B?gv@`~ckC8UॢFLc;\>#Sٽ-aaW.Лc|~~V,ofU% LajĀ:{? ߛY׋>+%%\L2fpjԀ=hX-x*z\V &'Py I{!JP*<>3,0G ~LH0)5=rD9F}SUwq'~9p\S],p[}5w{Sf ˵M2Nx"mH`/>T#?Zr:?/Y^/kbi^jzW=Qqrr koCb[lbUara!T#a޻V3 {1S{[}LHLڨ @'(}D8vs\\tOϰ2ÝͼߕN!HIY?]풬拕K?#ۖC-Ɠ5::+rwإ\0] |hpCQU9^6,mi]o{sW|yz A}^8{ϾsB҅#s9? 
0E7OVͯ8XD(?_U2bר wr/fZtv42hJZXu M?zBjyO.x]ZO=1\~mf)YKhOQtzz] ϧr@։]t;hMܷt/ nmpW΢QSiuDpAx`EW}"i2|p5 oz"2*@ԫ^2VV;Udoܗj[Ω _бܚۧjU^}n4t:-SP}pOoV" ;Qb6QCp?4YpQDDic 3<OY URnxDBT`&NBR/?ЎL>woQ>^l)v$UyL>V+%%5_\^4 g(9czw1њw*y f*ԜJ0gmɦӒ\–MȜ$6K0KSuE1vy0!Kx̓>МjP $I>K21,&nBhagH=.)c u8*T̻KnjJ ߅l"ډ&uk{ ʙ;MIm{"YF5TJ!/M8amߵ }ާoz-OCs-Z.֜)N3HbD*ȅTY5eiZ+}kL9yN*/lZFNa@fD 3JijY2j5##BkLJ^c7ѻ1MX}w޹:Aގ.X &(Y$x A4>'vǺ*0AlI63/s&y$L& iP$q,~Mer3h)TytX."66 /˛Odz$Od,娭:KI1vr+'>\, kO[6d\H$gn¯LN\$)Iq"P 9c⊱ #dAKEKRD"xc`kq"Ww$Xz"[ĶRʣ5-XV(aE!&R" +kȃC=Jp 4bTvm÷Om:7PAՆ mX EG#}pFi)о"gqֈj֖=p]T߶K Qv<,Qa2| IŞ-DMpI:j/+SW1T*zO`ewZ<̌IqR]ǽ_dV~:'>sS>'>sSh2`BpG(ȉ͵IqO R8B9A9]ʵ/bڗfeV++.im4s3G.$WW!)|Zn2'pW{!yJ_v@~mr0n Wߝ?1tj}1?5ԸcȌ C7s}`iH_ŧ8t#DXHQ _9rޮ7-$SJxD(,'"wq't )b"%iuE@O :IJ Q9$p_ㅮ}y|wg@5Wԯxc Cd@ uK%8j[8BمZnyVƺ< qsY笔3L2gL !M<2S"yJyAu`.(SҺLsx!BJ"|2@ n3lpP 8OC %ɻjgji$yCCd^2N"92`#]vV[:e3e Җi )% )*.rF&,]:u(.EC jӮ+. 0BueTnH*8 *D7[S_-u񷷌>vUGjy47$H5z/hE }eb(h~{ؾ0aL,ȍk[B&vV[0ϦNp^ppc@sDҴ`@"LfNEY*j Gy3S9Ilxf-*t4e-m@g8_UVP`.'N)CkURKc Z $[cnaaʬ$khAXY;6fmNW/&x BF1"$%yTPm]e<[޷k<e <]4fj=[rJhe[o jk\ /%EmKF }3w%& ƣ[Cc nΐƕ0_yL1{ytd-1%ѐi"KK >aY6ln?\^ y¸K a=r!Rm/FA |?9Ur-gӶD*M`$RDCV05/ 3aL\4k%';~> .&ڠ) ob#2iQ0$g) S11q3 NPIfkE{΢;NK)J!RT` 9Sf xDa0bw>r"XŐ"70l6,'݊wnk+7T H`Ec  q{q>_bFqxh, .I\4WM9-M QGal#B7Ӈ}Vb6#*RYԋ 4` Exb-_"&M QNuNzvN l?xNP 0!И$j+$1&Dj ZIHB2=.7n0.D!6K};1`.D+roݶn)w7vOM4-m&6l7.YrM_P$*Ti=@Q!̼x^0t^!܂(O [aB xcAB404uaA}'TFMOMOmStJt!^{naK79] $ &sFT kV`7 .ӦU8k7csY9vBZ%W{3J){|;@ñ~W"!',E4\eIåiBi &ԁ:'J*| MGӄʩ%>Se96:tC&𣁣@uuWF0y8%*:A6GZZjwc-ׁ(ބ2M9` 'B̽5\x!H ah!xJ~h*,Lv*adXP1kj%6Ēds -4f% $ ͓eVjA[54T`>-P-PiPfse $|Wxj!7FZiW׷粄SK.4z 8b9/|qY.nmǟ"U5V16|~Q}_Ծcd!nW(N猐p|/ WiN5O஛zn=K9>Hh2Y?Oct&Am) 'l45DTfT2¾,'SxV̓ɌNQP %D:1+eѮ݃ck rixbGWӾ^f /Eso%3 H(ګ)Z<=91wOck@äK+@- (vI\?ևl|wԦf[}n `BY@V&&0̕B7 I=rU1nAu0[+] ~+IV`Eo[m{:~{քY!OP&wݡWW {qʐ6z^+jSۺtu%3o}Fz^bԅjRG˻ ҐjFTh\[>N)U;rz|Do*;͔zJ?p-NmDv ?HtRA?)5Zru1)WpgGgB(&qD12rK J)O_pf!kXUd.:Uc(n㶑BXR႐?{4M<񈚇Ӊf2jQ J:GR񎦊iy'GmA iPN1ܕ%٪`'o{ {t\0,~}k~"Vqn>S90OlS39$D9^i%5ky^'.aS_h_u7M5WK1:$Wy+,)U%P0ږ 
V:F]KYkI^R:VɠRCH=% ѯ`P5{hu=[]h5acU2QFqށ&8y֬u-pJ *)hΙmrje4fhOR`NےnꓐnN;H#j.0&n` g<%8gjƬU[ԭC4{ Ww=RvH$s5*:l#өzvxbu*XW;mDבޏLY x^x5RNueF;'yeZVj.A]J3`?{ʀ/ZMn|>WJvb~ipr3.sE?J/~Gc. .G5;&|~scV:f[=.8b!gr2!6~q|bZuk1l)RcE_vke}XSĖS,u= -g'aHćdY㽢PKPkaVO n*6>{;} ]>xs%S eVA#mpJ53X&:}W;rW_fcdk/a/K/a+"`ŗ9&^Z|C,^-F'x_7DJ 2m5*"~y F/&H1x5IR@[ˋ/are^Z| /(Gϛ)xu'*OZ>Ga!%GXO kpQ7!2!q(N8BVZ*ՖPpZ$-}L)ȐL9LI(^ ~+( `Be6pcgRL)ՑY^WB#"F"52 Sy g *mdL3fŊ_6e_p;AUo 3_mu~zon`l7r`Д6Zl,cˢ*ZѳهId3>9M>pgy!7-Vk3߬ض&StyF骣Où` E%rfL]OfL㑮fL~smtl} 3{cQ`ȑD"8Vs#5"CNqQ=1'$`ucww^wS%][D,av=Ux< ć RG'Ɍ2H0l)hũ2;~:+Bڎx_\ub$)/G1VUߵ*S¤6{ɪ ?^w7[+P>x0FakJn;Z2drt S "̡%$": ɑ>gԁ$XEUya 9 fzZzv> yj>jԣ93.+i(ԴVf.0g1 IT Q(UX-5#ג8@-i0c?}(c?g}27#U4nynww.\gR)ΟIq`<7 ̼=Ta-%}f$ 褨HXk"S56GA֎yAt)(V'Ci{eSP?CO82R+ć# uˡ,BP4Kwyb%^欉W yXTf_\_}oҬA}|M ǀ?ۈ0xgN#L(r{ж6HIptJĀ32j{['綮cY#q=" Rs[[P3x ~|zÙr8M]Smr-X]OBHwmjI%G'zmtK<ApxIcI|Gc; n(Xk@Y;\?p('Zk)s5>ԃF2G8]͌aB-kYD[]LISm!յ$?ʻ׋E*m\ͷn ]rP9mr1C; k?o|q:gJ)l.;޺(Yzv#YwXnC{}-켺{c=KV'_$\zzϜE) 7֫ ,İU+H@6?.\^bc@Id?}DxwXaK1PK {]Hf8B P^XI}RG69oaL Φ+ M%sHD(I!%bv' >VMlgboY>;8|zt*Wp6R*SK]jߗs~֘7l>dqޭG.K t-~bql4%XNsJI_4ā_Wl/.A!6Xp`?/HG =MI{GQ:z;!!5JU 9qFB24-Kfs_(S/`I"CGPJgo"Cws݋g!P!P@`a(p(AP T͐F"ER!pnxm P̾C边t 82!]qum5uTH;-m$X c=`P }"Xﲷ bvv,`A &;󴖵6 H'ho- kt F1FHkd?H2SFovXBZR)rr>J%-.HĈ:-\LKC>s-d;ҍcDOA uJE=4MfE -݊7+ϜExJ8w[)rX]cZtJ>83g-hE9԰AT&5f\ LbT Dy*X~>hIC9+K=3ID-FGH}iLVTih5Wȇ  {f{01X>8E:#mǻ ) j"c` 7ZnOz :j=-|ndPl;c,42xORZǐH S$($hlo&c[ jv!zT3)n w#J5 B'I!:v2u8?>j"|; JRFiqL7 30Q#-4M/-{_v!;io#8KūѼ25X_blR⽟_w?_M`U0bL/J}Yp"f-ΎifM_w%DC'ocS1A SNаL< 0J D*K2D.Mmǵ HFij9oTQ%\jbD 1z Exg>7u#eM$wDYm)"I2|tJKnWDĈDs l }LB9kʆ(ͦOi7ݜV\1'?/ 4i!@#"x^id|-dNë?4J!vponk[Ѹzh,j@g/b9~"!rT6Oq }Phv\/dg_a=i 6" ax Grb:yݝ7Nx0Bu]٪1GV`- 8Q4I/cg fBbXϻA*bo\ΙF]Y=v|ҾxBb;6srU 1G6 aIP,])6^4IzzgEa㓤R,*;t#UY AubjCg/00;=JJDI I"ǾaH;N s="i({ϭ}$}[O1Eq6gĕҒqh$qs,+$FkKCqu5U+)[(p aŽ[Q⹻{jum9b6`eiqL` 2PkQ{lSi3Q-<¸f(iJslш)4\`i!zJD螶9}R2D)d]d@f]vBXW ]DZ*#i]yS)ݱrV%#}OuJ pwFpE6g~x9+yw 7ONΘXWn H}`9+yvcgZQʰLrMp F9{6ws?I'ǹ2IU0!2Zk1qdAj-ܳa{NTt[*hձemw0%Hh(~j)`DYh;4% 
[binary data: gzip-compressed log file `var/home/core/zuul-output/logs/kubelet.log.gz` from a tar archive — contents are not recoverable as text]
#kߏ|'"HZo^EP dl㡗^okϏF!2Z~,Ŋ~hiڞUSEc~Һ:fh#j(61ksTsmn-G$o>;8ksѽQjbU~GVnCk5)7N6Q d;Xs$ ۢ;Ah )m+-iv%ݬcP2?`?`cQB߸`]1,T)_xth5wbEqK'G9Z}{nGhh5sr;sTrݖhzߤD@kD@n1pXzV'&0<%@<6 nd̓y# P׼NH) uIx: YK~*YoYG։SQ0oH ӧYxLok3:֊3w]rZQ4hX%;I4"ul, ݡL͔/U(LRH 1LSC|ZY# qJZVPTd֧A^鯑P:1wLk]Z=6"+8׈ sS5H8\Da "UCg5Ƅvjpn|?%[8SF Z`na"\CeZab4g IP!%$I ODF(D0͝f:ց̰yRtj/@pk-E΋LNgV2`59^E'i3{R)/Zjر+b q.)o_hF2_MYGq1 5_h on;%] ƳB6 owXwŶx}\UV{7MdQB `}B \u| .%dQ >, 5Oʏ`L R&{0ӧqU2rƨfJ;6泒Yr !iI%~H+$g5nw*CEݤ餔K~< -ߒ#a:͸/<$9$ 15 ,))N#krq뫱\h#8Xee111/&َ5 1sNo$6#[DQyk}3"ceVNZڛnlj,9}586[lfTt{F$z*L5 M]vBS7',2 kCB^&gooh7M> H(I\v;krbJF4Z !\DdyŔ>p4=f6ffմل[dX@*Js 6w:6އ [Jc=&1 PJ ~g|؀.6Ox F?Sҍ>ӣ"6$䕋hLMNnvKAɘF#b=٤"մ !\DdJrOP ☗fi Ui>vFJƪ3h_&RLj:k8L2hW uyL}K7NVQ:njL +mfp?=𿻼% =#w%&HwP|ҭ`w(*A]J:K)rS~ۅVZNhҕqU4&ܟ)K!w6Ȟ P+*PtgIw"δH0[r4:Ea |-mX%!S뿬 =rru5)0+e! #;"Noo87eLX)H muQh#с13y6{&"BlQe Z`)yafyZA#28)2o w!@D:~cfhy-p+o3F a̅ 8| m kyV0.i2o(hHʴ83Ck'"/0er * رRg:w8SgPx13ҢeD2Qt dv?13|Ъm.{ 1my9ԕ\e *gd]TG>fV.g/n1gYT W~cfhy"I{>Y>cfhyJ0nRp?[X)ٿ|ňy-l,"lLAas98uh+#CdkY(1 I,,k~Fa4)0]j Gs?\?6ğB|/?vq3>hH!RfnLpX,F?tɗ_y?mpwKNW\sfYMN"q :iGjb4]J5v1OӍ7|3MkNE)G[gj,?fE ~ 2pTZ唺q[-d% MX;C\0jSsnr|(ϋ3N4rOΆ$d e;H)$5xnuAiaCHK 7(p15Qsl /li#G-Kfz ԺVٻn$WT~KwHIvSZo-Dvٚ R#y%ơl:$ h1#3iݾscۆۜ{>5l)~xp 'ʠO>;}FagyFa "۩?:YWJ8%%oߛ 9w'n\#{uj9GplӪOҪ 6J痘5;=b-V;ٽtg|si{;6uyf3`ɽiݾ4&o%Xk3tBjj>o\|ur"Ռz1[GhIH#noZY7N~N.^7'߯`[a >kN ~X; ћxB}5n3o^}z] &TvOs"-ݤSh- lD0%rU6޽a>0޳;s{)t?rD\v$@7Ne'OմNɩ^i{~B$О3_o ,sw*b0FHv&BkB R{5iMZsDù%J#@DSUy %^*ڿ?b :=杯Qh ȁXd\J| IzҲ:tmNZjb "*ʊb͔7 /ޝT.xcktq`k|;C^F$*REoYTD&ۮa\W!7.LhOcTkFmPgbڹ|zCqUԹ#%ٜW[|"36Kqr.^۔O\cVy6 q39Pi\˛}7u_(v&fqֻӳgm2";y҈KZJ}|rkS\rt}?"de Ol53EtQa\4"*/ud1|~vRӟrA{wSf[2.kvxS(EFME]=+#{IotvL;Ic Jxg0%#'ܮPOqR}#d&fuVsSRh5bۋbW^ᅲ*p4JowؔI(s~>)\  z6ξV'\!}$0(푈 /q\{1sESMCP4}!BGG"^ڎuAqt(/PzRwϸg"4ȣg?"@ʣC9 "1C9b-);Ŗ8f!B0_p>VC'A^|E`s<+hB!4Y"BY\ (]Xk aj'7nNft[h#e[&%b1 1{-9I&gGm c S7{wAֽU|F?XV}ڶׇ,[d㤟 ڻz~V={1J%:mAu-q[mN C ۦŶRmGoUsKḞqM#7qף~y }[_>Toy9uI7gQ8씄nHy`djm]kMJzJXYLVfKt +qv ꔃ#}*+kK|5pJh0jpGYdq8\%dT1e笉$VQ4nÐQԒvO]j=bYY*IV`d[-cP&ԮB@nG Z1`#[rADz;6n4 ŧQ{ 
s=an##e]c==lƜ!4K`HT$lhtZeɖL'iBHDUrV5C{ug|9r4k\z™aV6Zb>N^-mgfw(h%m`rܝ gsv]>$EV$DdprJ%Rr65UsډʐU6;IcaIjY|Y9[;/7*D˂צP;I~ÐQ#;n|؆@b(\S.q(GR?ŜJ2fP680jk|'E.6L֞BB Bֹ]H5O>g)ˉg_e280jyĎ 'B>(-8uEĹ<'ZYΐ-XtYP1jQtQFd+ȱDc sU`a súє)gYVݶG9v`_OY-jLT4[d;x䱲D|o[q^iZHf,"s)lL3' [HxV=Jɖ>eWY[ʶ"C t "zI򩂈~HMx'Tr TA8jndf+dy衲bCQDMtjw\c5ULtHꬥR3&88-4R<ҲA7KjK9%9Q+JE[e1[.J_鷩k<"dUv){C2n$Ēq%,KmR,.[KRe+dʰ>X2 AY|BgOUOo޽?`q٬ßf704eU\yXSF<_@є52yG$\deHڰ8-l@O;q95Rux{i3z8q[BSfdml%M GT"BOʽVb#v-K3V*´^q`UXH궧lnzb7IBֽa }l`#UzYcizogz{6ȿI2@#r yy.7Oe*r,{y,m>ǰSzAJ['͚]l+v3z&(J7w=s_ůt6eX+8*u/ BcEERw9;Og9J5tu-QLlp3z]$_+F2<Xcz-knRexp+N{91;yy`O8=JH<׫-4pVƱ<϶k]k\3(OYYSO48:s3ՋihB , @%ƞXwii~u Ԃʦ-+@2A2t:v馅Phy˴fO 1ELq|[ rͿp,ȐS) Efo^\E8, ai dr#"&_%։"fI,m7Aa^}k]WŲ*+l> ^bO6cusZz Jޭ'a囓ʃmm~BSԆ./X]bO7eq[Z 5y?&aْorߪ,D̞`bR+ˣmeQs7nD RZKE`jl*H_\ؤZy1Ъ1 5QP[/9!mqJ8@_O!+܄f]XC pN?As1n,L:/[ߧPc0:F!ZC^ā̪U ̪P|NSC[(NPSAIQU&Vd ~a]]~qY:dk(VX,dM쁋sTF'E9}k y0OZ2B5S#'S 1=GQ5/d AD0&sLgivwXG%4 K|.+JY }h`e!ԟjƁg7SS{J29|wN<ar .шu am:7 tӈb$+gi9q!Z=s8֩x:9lCjO0ҎEQstsmCzmY;v$ mZ/i_Yգ [:굾 V[fww5m.V=i׫?:ֿ7T| B5wܶ;md|Ͳ zZ-)Ż'gWhӿ 0c59[t>+ Ǥhl۞-q]FAb.RR$wRUk tJ^ʭl~۳ѩ>AǀBg3ssZӲqɛ69S(YG߰- ʙY8Z8JH~"o1]oU@&m'Et tL1:D 7l޾Lr03ח681a$W@τ9+&%8R%2(O895Vǭx_rޮ*d2Vr'u,$[CR?{WF LRyG?E3k/ "OcTݳEJ(,DYc?YGq@̲h$33Mֱ}>X-5& ڤB}dIm]MYT[$㞂-\mUF7BJq@zoi4!@ecUZ·U/BDnt{HIo*n&hkyZ6ˆGAFց|A] t!P՟qC`Nh=ק[`/'a(pp#2ڈlG}zB"9F1GM8Tfz~~PzjtF]0R2Q'y O+ R=1@I,!SۖPR]#ܥRpc, ..s2@- ɁLR(;wAec0<B+Q;Q(N>JO¿~Wc,%SV@iBTG- ݼv&b}ԟ;f{G@Pr5WV|(OPbKV`h]e[+C+q'̘R|&jꢓq`IT^+xQN!^`b-,q5y?}zJn5VWb9s$xSǿykgPțwoo?Uf~^W8PAgLZ4~lRȇEU7b=UP4(tZw._SP}N({f=b&?xk!5PLVXYEm+ExRCbtRJ[ !l (te*QpҢE2&1]Njgg 5%KĀ B*-XM2jeKu?,'xGOedtP`>$QT@['>HKeyXfvw5.g͉GUMe (*sOHMJ"t(O# ʘJ8WkjJ@ kBLSZ{S4;f]-J\5*Utr1PY#>;?# \|UF 7?վ}R!a:oh`wjqt:̛7Va+[|@r@m/?uӄ\nZ?U&E%tRi4p$ <[KB.н+JpP]W59k[)`x^c~4F cw.z7h~@(JL)HH)K<_޲u>5xo4KVMjApT&ȭ.Svk43 qCОhsz3 !Wb)C1^po'_ӮL"~1O;?^Tҝ1Z![%/qw{?Ǜ^ua?"z'0F 45厴FTsۯ~r-no.ޢҮF1>߆xT=7#R(ͥ3hKL1-OLƽ$V6G6DPN' ;A uZngӉU[ڗ7ι2P6%^gFPUTܘx~󈦉]S훷@+r"Zfw@zuovOl ![;Dʉ½n4h5bZg>u"yR.IԼX-s'd)ar"ZrKvRƩϦpqVSnj6ڬ!?W;nՅ}3neg|1[n[-Vza 
,MϦW2{YLUB>yf(b 2/n y_c٫.#̒tT4}DzEWqfGMfg㛡?t۲AEԲŭ1|/0׌KEޗnHrAeYv\*[!]QZLv_=:];n'P[#5[=߼6S }к ,[S t8ĺ20rm_\֭9SMF7G)]k3S 2(E{c&CZb ɋVe,&8˵Yeu0oFɕMW^YH>pC:LCz:O8_{SxQSxGרu~8Pcݩ0ʼOlex⥥$I0|P{\_R+TKz+ FQ}&R0vZ1eMdJ-}Udmjq ;s!\~=dǃX{zkDG.lٌ(7r&V+#_[GpQq(kkfMwvo8.7.ğm!)3 i{|̤Q#k6X>2NыhH 2#ȴ;}GQP`}Kј‘Ԣ+G0U4e#Es 40BĹ0 $xAk=jy <$Dcc5N`pz+A_lѰQN XНAq!28iɽM"L:d9+OhnG76z4JD4S  {g-%kD~VԴBv\! f3)UR'.Eqm8|ɡaG wr719&Nb&T MN$d͹ 4ԻǬ}>TXC2pQt"8|6Vp]t(Wͤ^B. 3Cb!5Ji\$C0+w!y][[i5tFlx1v[:*qH~-'2e][ sL &ԴR*+.FnvKГd-4 fNHHA|)k %=4(4ncEL_§'5М^ :bLqMUV9햽%(LtmHu~ )=}K B%)Om0-_{=FRJIOeZe$W|E&O)pK)5Q?Nð[{ioK;DOe70mIܸJ[ڝ??  Lf-AWN)%n;TF%`G:WF<'rO h<کCM=Ղw't_!OTgV<:G] rIpˢR%=0qZ=NOqg{TSa"1P$KZf>f4SժNgJd> H/\VD@JSRmnx2ٗ뾌f>Hϕ;1h\!ލ Y$kԮ5=3uIeVEK?H92'1 ps@B,І^IVu} KS추)qP(Q>ɤ =Պ䳀}BAۉ 6/jAVolYء\W*{1_Hȃ$W:]n`n(U͉L/Aƒ *Dž՜T4ݥO^ ٕТiSQh\\ [*.l-dY+uwwKm!zZ VT;];Ae1Oz=іYC b-.S2W[c͇?yʊet"DBiA% ,噄fZP ݷz kN5Vk򄣡[lrCH)R)mVldǺVpmʝR*Ok83eE j! %@LE,}?Ǔ&L)KdRGbD ٸMގ:kooeO*tu>{uT;yEIOc$(g )85]U~AZun+}wb3S38P~aJ; ԆR${v)S*%&˵2k"1M742^/#9 02Wj#b4eW,B[ur)6M)6Pz*zL Jz0I3)Ymc6Z%&Cу&֛gtP@grtiCarMk1MlfW7oa'fzYul4Do$~1Qz ޠ?( `t8)$),la=Z2g,3SsMp%ql9Rh_,I?L(͑v~4]eږ7Ey=ذHk΁n1 aV#ז\h`BѶ=ZdŭA2RiN}W.YV[`fؘ 3Z(%Ctd聯N |Ziv\+0vZZwJ_batTֻ[<: NR @벚OŶQI8&bs\/+ӟ8~`_Ds @cth,-Ƈ߼"/+Kb%͚,QP^A/kJաE5)RHz弴J+ 6Q&s (Ak`1L;AUFPدPQmm"d &63.H*x͒6v2GZ]sf,`A5ڽaG_L""cf0 VC4{N ,k|;^հ`stL` !0Ȅ|\"ZM F1jFי-ś Xc*eRQjRR6)"u,G#搮8jd;Ma/>p4%0JH(%^bp2Qڡ;b]0Qy OFՓ HFl|d edb,6ie 1P3 3o=*D` Y'سO1T+Iw^|CpFG/ObxB%HJ.yTA5GjئVj}DC@ET\X54o:uz ^kwjjL`S-ɂFԣ`#׃ jRޣ&mY?I華رEaYx("y0 E$A$$H(4LX?I}?ٝ4ԥno^xU9?gW䨭&uf--؅CcԮs[a̵|Γ-^IP[u_>MِϭK]&U>otM8jx uϥk[SZNr>ao0 nq,v}n&dI3 y c'M/AjLyrZ]$RJb3I#1A$b.utpyWDD@!?\ rOIEv?϶ lG0|I>1oIdITPڨR)H!d$ML ]K9{0ZeR+W@A"{<7|A'Bv)^ɐ*yzccE A6jnԆAը1pfIIc5 7 !EHlȀP\*1y#Kd}10N:T+G\p$ m^i@dpbU(F Q ԭ͜f IJ'>hB.jzM7znx))r;fnrOW1/G~@jIILXv [%HL{H_^ϿQ;_|>XwU2:<)AǻκGH%:(I"9Ӯt[>)4{'I-$i&H/(5]F;%B6w?GM6 2/q P7H\o9*JwpZZr ȠzMޫ%$ 6m4OZ/)Z@9J%ꬄW^'b,`Te^3Gv8T=>g1n?]^hfNk a2;on͖O\8o< ]v'Žt&?>\#hsA{cL, H(V3n 
!"H*&B!ii˓ckѢA5_%@E)U@oڇfr~1_ғKK)^:1KqX؆ylTr):0zyz2=]UF55RM3!ٵ \Q,nſ.deU!˷|ues Ob uNjiigw~Ơ&s ?.-\ X|cg՗|&b?xp'=<8x7y=KU;KwmYo2nl {5OӫExC}jHwtHːAm .F?aDoϪ PVM˃8Pww{5T6Z>DoNY) z %3]$ ҝSFzLj_CDһ^3>3AeMH+cP"^rI^;!Xn&fq4 hϼ`PY]d)Oɥ!C,IDK^7 ݋Ԥ%8\S#hlg b&/QYB~ B$(숍N?SVZV`ll r&k9;j8 " Vjغ`&ȉױ)> ',dxu]6 魳&CWتɃ PJqVŗ<#*²[<#j7I`\eF5 ɴ͈ZrRJ>xrmFR:=`p4`Fԭ,Rg%3*cX ÓKu]o.,~eB欯w IZL\]bjv@"z&Cҙ+)t^{Vtt6jӅ3 8Щ1!04V H adU a7. JyAޛkaX%Fw^[5Z($/\qJV ҳoOEe$h4dB v劉?d;[{LVaua/)Jv6( ߼68*@Rmh,e )^6 ABKQס tjo"LniH _ݗKh B"2XoFVno?sONcv?1hRKV.;$`0tV-97i*%4%K]r>q/ ة(dqv`f-PɏXz&Uborqu㊟\ҁ,~I.|JF*E.GǞg~_{~Qs܀t~7UfϱdS5VLżkiodz>ƽX߾¿kl][o[9+FvMdUyA0ط^cK$Iߢl'G:G,ս)VfX!Ţ$1%d& q* 5!Y~SJȜĻ}6 Uj^JD*IqBI)div<@8ЊaoH3 EJ7tCP$9*p}Uڦr~_ g ]‚h6-g! s Q̐Y}棛{zvrgQIR[ /{/*6gĩ, 6߁J[Kno<6}p﭂_.>;/颩w]~$+! VB )YKV)&HʃXKWknfc>!d(Hf+G7:ؠ_{W&ޞ=6rrY5xfQ K_P" o=z/urgpl^X7e&e[a nfE[MGY+Wq!ܦҞebR\ԋzO YLH\*yF;'G{b-{LvA{ SIBfxNoOI *d -ie[oEl<9I㲺p,\6Z f +DUMb5gT=X=7g [l3ks.eRQ=+-wZO&#=&*P Oz.{GqØ0*.PKJђjD!L%9CEL˛#gcTa(s)R);I%ș4WRjlmroт[ms/E-Sw{)[NʚۤWIZ3%h,hJn8YsEd8DQ p#٤DCj 8H-SZ99vo^|u~[΢8[e"wSYPA*Hہ'>M}ys?$л%?VݲGSƻG׫#o|y`5wkkYo $=7>, sAwE3v EKb[˿c|ɝ6;dO{fv>o-tS||{kq3|B Ư͘9 v>hjn±=w><*y[^@'P/|}|[Op}+[ʐ~S|Rk'n[?1Tփ;قf,a+*1j?|3m{Zz=fLkmlK.rŊȨh(kxW*/vY _ut/|ʇߎ[!wDtszGWo߼~1_\)Z1Q`:S&_}.křPG USzO[v{w_=`F=ύk@@PH*A,@iudj\o]9hKf.\-nOMGK yoGkcOVY>ە6y+i?N>֎܅^>]'>IfCg w0}/ qcts_RZP>8kjF:Y=GОbp w4l%VnG7 3.bڿx\>H=.pHhG\1Z&uk f):Y5{&b)F`[ [Q=Zé~մdW!p+uRmJ+ 7< FMs~xMR c:ΏRiI[˚mɌ֓ %W:$\RB"7 `sSn^Վѕ0ƂE.EFE1ID> SЦ /.hxxѱQA1O+.Ϲ8[ 0$Qo*'WsbC%Jb?4 7F֥9_Xk˵ipm( Ø>Йb)͙IPJ0[qw-l޵ut1E!}2Q!8FHo5UY+sGY7S P:Ϯ8>h/a/4SNsܱb3IrTb aw9@NkfwݾwwyptulQom:?K'}g?b4A{ސvNcN~hpo# C&`7^gu=e](-ͨ4U9z¨byRWC s'ϿB{ ѠCp)WY0$BcjfdϢt@ AUt.b)NZMo5}BpX>_YH|P׋l Pb "STyDdt~E8rsc6;6g(RkNg  ]`ռ= "cHَ1śnpKa2 XfPչD}U^I>\$CM}@*TRp+gN6hi},[,*OW8kHlh}L i6f$Pg<ƛmfNzsqiYaI|Q7i%ԓGﲑaфsh)|E:&~D4a} ]`Q">CŒiɚT5ыUəN~T3`w-'u,E[T#PiTd꘤$7)T,Zޱ"ibsSֺ s\Dl]|)5VcF㴄c@ּ9Pt?(ZmمR%Kf^@>jwePZ1j66b^Jؒf!7U=ΕJ5'Bnլ[&N$ @iॾ/BA-y() ؒ #,7D5J}Y1b/,Y͋A`w5꣆R}&[XSH&Fm1'1Db M{GpYj'9KFP&!w.8 )mߟ)9m 
ӿmbHh')Aup-բlMY,_4t@,qfF0#VioפN%#g ̡4כ w^4$?1Wq5\YgT⮑l}ns>Hy+ SQM5&y%#F UIFS!Zh!%\-7cQQN2 g.!yA"SC5Iۅ1H!}Xo.$Ҫo]P$Y%!$x_$E-C3-).a͒HGPg6Fr䈜%[4%ZSFM^5zN2Cݺ8r'u~qjP|9t te@5;܇#>|*>4/48ƾ)wH|Xt}b|Vv/#O,M]y_ϼQJ#s;V``y]0I @RVAC% X: d3K8%<Ã΍z`@ vٻ涍dWXzւEUypYVd]8yI5WkLT Q DZ|D沚rҹ5pdfy|fϻ ^}Si6ٌ;Zsx 4͸q4nJ]#J.800 uQ_a@1bjO3F i"v'zH%uj$;FX fpӽ*+N>P5!V&j#M%g6FLSĞa|HEC+su^]+S2z6 ) ';k_ ΝnX tRI/t)7@3I5HNSiG:;EVG q*UxD*4*g6tbϙ + u1"F3b50xsFaωR3 lSׄUR8\]n#D uqb"1۹Xa3cOy0gJ-MU=RiŅ@(S@Yε$;G8V~=]!U^ktFgNG-[coZ"-A0al7%piu9:}u_ KtQjEgM QYйDĆ`tTlL; g0e/ҡ=9wfXHfR0R6)tH=WVp=UɭfoTƉY>@?&9gzTnU0w%Ƴr;4`3GNaÕ[>-\֓c.z|rY-2/ϪѼo}}[f:] ËA A^\$*>qHO$OpW$,{yolBC/f@ҹu 4.KrA2+tVh I~fꦻG˙ajTtk~s߮Z5 0{*N(5@pZ:42 _+GmC@x7I|{}l~9gԌF0MMjY&JI vueѴ O!}pɏP|K-&#uW\{#Uv>;{F+Jk8#߁':‚E'|>@~~, 0ˬ7y^,Z[o-Z`hPT'8lAYHtqdy::I2Dad7D%`,&|m0RS c,,Qun"GĆF4?h2ҲRݢϗY3Af+ ?[0&lv"&`ghgpm`[(DBa|oN׺Lm-b΀5K+)<4lD]:y3NBn[I]tKl8HZ"woӕw5])YWPkxEplгXX*=I,ܗ; 3NoX+EEE[K\nnLMEF0"d,ev(pR]p7ܬ 7)cڦ1nX7"׾ b"lqXJ5[ˣGOñtd3jq:gyf="Dawژh#8##ڜdΈ_r z!%VЇ"Q6/ν< `DJ>" M~;TcӃNNK`\nOw& %4+;RO]y&&S "f[1k[K[EVZSבEš3~#GF)X١Qq #R.6ϋ\wb6:7-}`PU pAH0T6PBx$Rq sd6SiS;).(O|M@~Y>&ߏy>>t0J;ۨ-B w̷0ny҇boZ _zO]#e.M?Tp$@B:< hrgPdsgˢRoذȲ(iL]dPؙY )Rf*n%pqDFwHTi1aOOVD:THqR] B/i)иaשHV/IW) -}u%G[zZJQ5-#ZRiiA5'ȣҊZ 1d?qe+ S>j*)@C(B E\ TkT8,VDP'/k)J#M}N Br&QtH<OO)0;!# -C <&TpY:BB @'vhuNCI &ZbYf++>f8%tNhr[g.Fo A9挤jnDsi]cd+UuWƃ+E Vcks^̟hfjFuhL-GM'˾{G*̿a=}+XI|9[o+Niχ`V1ԠVcnQ; jo$(t˴.&ƅ)dY7byw^,ڀw^6:t3|]b,&Ӗ497\BGQV !`V8̱+}Oh*5Wktj!1G'55$ioK~U$tg+a?_?V]E=)a_n_+a5KGNfyY9Tw&{lx2~Ow/T*\_Fl%Q֍ xV˴M. 
ޑ6[]7/ѬkDϛ' *N_jW\jÌa.@nm.TŢ}/)/P `=OK-kE5PkJtSkԿmݺ[,:N KECeb}s2vVi(yg7NjSP~̓R5ZBw(k4ߛ!T[IgJ1z͆O0w^Qm_$\ѐWt TKʖû7gs7:9_:IE7X 3Ϗ 3(kUd*`+@d$DCa=fp<¼بc%\faQr?m|iOF״+ՆڃJ%݃:_Jߺ$ѤvQ+EplGؐqr)v4-a#dT['hCHlGA1RBC3Q<͉GqbjF6Zcw'e6S7!*N9̕Ld׏['RJich7'[Ѻ4"yraOQLq=\X~l!Z'7b7E)vS-G'Io)GdAEXJ.&Id3;a \Hq(]oB@ș S[Mީ(%qkEb}?R]ْZI#!9o?+-{XgA.YlIVrqJ٤۩7aˇ4:;'݌^u?զVzfɢ__ϯ:mOq藛$+-D/_~zXayJOe[ܺs#ɟ>~L3ҖXͨ^ܽx`%u|G9>3)+RYhZri5@ iq Aꖌσ zV.K+ j*|c[ENk[yY ֒l3 MZ)6L#*Jv=KV\v4ZvB%0cM!!>;(g/́̆zwsncDnP>>I/'b,jcW;jV=n ՊT~_@S9Ḟ9܃@GϮɘFMku ^(v;jvu_Om둓LdMf]ztP=%Ae>(SjOJ>(%X2#ڀM_1`7+o>=}6Q̵Rc^"a=X!hgc_ɝ&2^Oכ(L>`g;:Bz}Y|"W[ߖ bl=zחo۲5Sm5ZG~,F|Z <؁+9T9TŐXd 3SyLPڲkɳz n sʚYAc8:Oʘ'OBc̅s4`/eQJUdMOЀ5Ǡq/-a/R.-K/P}M5hc/g  x"BަZI䋕b".6*^EԂs%Y+z@c|ƨd˚h\n*H #CT. .'X&ɞz#`P!&%I ElS]e+xaT!xi]d}T_?](vRYX ׅzC54?o+Xf2Gh8 +Xf+N8o+.Ry'aeV`9o+E[fhWq'aTIt>Թ*RIu*'UR1A%%&mj_{u ԍu:4J5 lyAakwdUhn=u0PK'Kҳ1F7ujIJR2Rz >%u:L GZ/G7'9B?fn; ӣ΄nYG_b h ӓpьJׁ`@MpnyDO]l,5U(HWy.T;"F̒ >hٺ9,EGt(Ɨfn5ksqEih9V̖HpD&KN$媧5:4`#d.-M#1D[[4y-{` :# >9B9fԕIg=7!i3B9aIU|bCQX兴ڃfLo)u3GsێP6dJ dnZ.rxB,.(ZMAQ҂5()^W*(cSPYjV#>BLNxʁkG+j0sHVS~Wdkч`VfE0kj$556XW,$TH$PHBljj fx 3󸥱lڗI}nׅ18V׍^kDt 0m ,VW?_jvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000004663414515154341731017716 0ustar rootrootMar 11 18:49:12 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 11 18:49:12 crc restorecon[4701]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc 
restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc 
restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 
18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc 
restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc 
restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to
system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 
18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:12 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc 
restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 11 18:49:13 crc restorecon[4701]:
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc 
restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc 
restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc 
restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:13 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 18:49:14 crc restorecon[4701]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 11 18:49:14 crc kubenswrapper[4842]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 11 18:49:14 crc kubenswrapper[4842]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 11 18:49:14 crc kubenswrapper[4842]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 11 18:49:14 crc kubenswrapper[4842]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 11 18:49:14 crc kubenswrapper[4842]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 11 18:49:14 crc kubenswrapper[4842]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.738654 4842 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744629 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744668 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744684 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744697 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744708 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744720 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744733 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744745 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.744757 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745216 4842 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745252 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745264 4842 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745317 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745329 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745342 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745352 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745362 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745372 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745382 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745391 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745402 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745411 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745421 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745430 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745439 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745449 4842 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745458 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745467 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745482 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745496 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745507 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745518 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745531 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745542 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745575 4842 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745589 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745600 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745612 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745625 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745637 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745650 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745661 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745673 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745683 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745693 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745705 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745715 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745725 4842 feature_gate.go:330] unrecognized feature gate: Example
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745734 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745744 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745753 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745763 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745773 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745782 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745791 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745800 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745810 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745826 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745836 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745845 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745855 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745865 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745875 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745884 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745893 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745904 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745913 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745923 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745933 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745943 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.745953 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746133 4842 flags.go:64] FLAG: --address="0.0.0.0"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746155 4842 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746177 4842 flags.go:64] FLAG: --anonymous-auth="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746188 4842 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746201 4842 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746211 4842 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746224 4842 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746236 4842 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746245 4842 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746255 4842 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746264 4842 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746311 4842 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746322 4842 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746331 4842 flags.go:64] FLAG: --cgroup-root=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746339 4842 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746348 4842 flags.go:64] FLAG: --client-ca-file=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746357 4842 flags.go:64] FLAG: --cloud-config=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746369 4842 flags.go:64] FLAG: --cloud-provider=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746378 4842 flags.go:64] FLAG: --cluster-dns="[]"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746388 4842 flags.go:64] FLAG: --cluster-domain=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746398 4842 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746407 4842 flags.go:64] FLAG: --config-dir=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746416 4842 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746426 4842 flags.go:64] FLAG: --container-log-max-files="5"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746439 4842 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746449 4842 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746458 4842 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746467 4842 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746476 4842 flags.go:64] FLAG: --contention-profiling="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746485 4842 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746494 4842 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746503 4842 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746512 4842 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746523 4842 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746531 4842 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746540 4842 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746549 4842 flags.go:64] FLAG: --enable-load-reader="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746558 4842 flags.go:64] FLAG: --enable-server="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746567 4842 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746579 4842 flags.go:64] FLAG: --event-burst="100"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746588 4842 flags.go:64] FLAG: --event-qps="50"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746598 4842 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746607 4842 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746618 4842 flags.go:64] FLAG: --eviction-hard=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746628 4842 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746637 4842 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746647 4842 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746660 4842 flags.go:64] FLAG: --eviction-soft=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746673 4842 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746685 4842 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746696 4842 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746707 4842 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746717 4842 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746727 4842 flags.go:64] FLAG: --fail-swap-on="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746736 4842 flags.go:64] FLAG: --feature-gates=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746747 4842 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746756 4842 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746765 4842 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746775 4842 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746784 4842 flags.go:64] FLAG: --healthz-port="10248"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746793 4842 flags.go:64] FLAG: --help="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746802 4842 flags.go:64] FLAG: --hostname-override=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746811 4842 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746820 4842 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746829 4842 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746838 4842 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746847 4842 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746856 4842 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746868 4842 flags.go:64] FLAG: --image-service-endpoint=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746877 4842 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746886 4842 flags.go:64] FLAG: --kube-api-burst="100"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746895 4842 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746904 4842 flags.go:64] FLAG: --kube-api-qps="50"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746913 4842 flags.go:64] FLAG: --kube-reserved=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746922 4842 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746931 4842 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746941 4842 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746949 4842 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746958 4842 flags.go:64] FLAG: --lock-file=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746967 4842 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746976 4842 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746985 4842 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.746998 4842 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747006 4842 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747016 4842 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747024 4842 flags.go:64] FLAG: --logging-format="text"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747033 4842 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747042 4842 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747051 4842 flags.go:64] FLAG: --manifest-url=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747059 4842 flags.go:64] FLAG: --manifest-url-header=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747072 4842 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747082 4842 flags.go:64] FLAG: --max-open-files="1000000"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747092 4842 flags.go:64] FLAG: --max-pods="110"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747101 4842 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747110 4842 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747120 4842 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747128 4842 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747138 4842 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747147 4842 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747156 4842 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747176 4842 flags.go:64] FLAG: --node-status-max-images="50"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747185 4842 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747194 4842 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747203 4842 flags.go:64] FLAG: --pod-cidr=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747214 4842 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747230 4842 flags.go:64] FLAG: --pod-manifest-path=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747238 4842 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747248 4842 flags.go:64] FLAG: --pods-per-core="0"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747257 4842 flags.go:64] FLAG: --port="10250"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747266 4842 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747345 4842 flags.go:64] FLAG: --provider-id=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747358 4842 flags.go:64] FLAG: --qos-reserved=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747367 4842 flags.go:64] FLAG: --read-only-port="10255"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747378 4842 flags.go:64] FLAG: --register-node="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747387 4842 flags.go:64] FLAG: --register-schedulable="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747396 4842 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747411 4842 flags.go:64] FLAG: --registry-burst="10"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747420 4842 flags.go:64] FLAG: --registry-qps="5"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747428 4842 flags.go:64] FLAG: --reserved-cpus=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747437 4842 flags.go:64] FLAG: --reserved-memory=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747449 4842 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747457 4842 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747466 4842 flags.go:64] FLAG: --rotate-certificates="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747475 4842 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747484 4842 flags.go:64] FLAG: --runonce="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747492 4842 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747502 4842 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747513 4842 flags.go:64] FLAG: --seccomp-default="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747522 4842 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747531 4842 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747540 4842 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747549 4842 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747558 4842 flags.go:64] FLAG: --storage-driver-password="root"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747567 4842 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747575 4842 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747584 4842 flags.go:64] FLAG: --storage-driver-user="root"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747593 4842 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747602 4842 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747611 4842 flags.go:64] FLAG: --system-cgroups=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747620 4842 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747642 4842 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747655 4842 flags.go:64] FLAG: --tls-cert-file=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747667 4842 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747681 4842 flags.go:64] FLAG: --tls-min-version=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747692 4842 flags.go:64] FLAG: --tls-private-key-file=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747703 4842 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747715 4842 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747724 4842 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747733 4842 flags.go:64] FLAG: --v="2"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747744 4842 flags.go:64] FLAG: --version="false"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747755 4842 flags.go:64] FLAG: --vmodule=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747766 4842 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.747776 4842 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.747987 4842 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.747999 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748009 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748017 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748026 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748033 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748041 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748049 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748057 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748065 4842 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748072 4842 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748079 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748087 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748094 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748102 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 18:49:14 crc kubenswrapper[4842]: 
W0311 18:49:14.748110 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748118 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748126 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748133 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748140 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748148 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748156 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748164 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748172 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748180 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748188 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748196 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748204 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748212 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748219 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 
11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748227 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748235 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748242 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748250 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748257 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748265 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748315 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748325 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748334 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748342 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748350 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748358 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748365 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748373 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 
18:49:14.748383 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748392 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748402 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748412 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748421 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748430 4842 feature_gate.go:330] unrecognized feature gate: Example Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748438 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748446 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748454 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748462 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748470 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748477 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748486 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748494 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 18:49:14 crc 
kubenswrapper[4842]: W0311 18:49:14.748502 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748511 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748519 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748527 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748535 4842 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748542 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748550 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748558 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748565 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748573 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748581 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748588 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.748596 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.748623 4842 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.759259 4842 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.759343 4842 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759456 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759467 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759472 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759477 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759481 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759486 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759490 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759495 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759499 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759503 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 18:49:14 crc 
kubenswrapper[4842]: W0311 18:49:14.759506 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759512 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759520 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759524 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759528 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759532 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759536 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759540 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759545 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759550 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759555 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759560 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759565 4842 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759571 4842 feature_gate.go:330] unrecognized feature gate: 
ClusterAPIInstall Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759577 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759584 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759589 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759594 4842 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759598 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759603 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759607 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759611 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759614 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759618 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759624 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759628 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759632 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759636 4842 feature_gate.go:330] unrecognized feature gate: 
MinimumKubeletVersion Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759641 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759645 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759650 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759655 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759660 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759665 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759670 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759675 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759680 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759686 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759690 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759695 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759704 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759709 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759718 4842 
feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759724 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759729 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759734 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759739 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759744 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759749 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759753 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759759 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759765 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759769 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759774 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759778 4842 feature_gate.go:330] unrecognized feature gate: Example Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759783 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759788 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759792 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759796 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759800 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759813 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.759822 4842 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759974 4842 
feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759982 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759989 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759994 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.759999 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760003 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760007 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760011 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760015 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760018 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760023 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760026 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760030 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760035 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: 
W0311 18:49:14.760039 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760043 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760046 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760050 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760054 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760058 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760061 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760065 4842 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760069 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760073 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760076 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760080 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760084 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760087 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760091 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 
18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760095 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760098 4842 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760102 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760106 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760109 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760115 4842 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760120 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760124 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760128 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760132 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760136 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760140 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760144 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760148 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 18:49:14 crc 
kubenswrapper[4842]: W0311 18:49:14.760152 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760158 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760162 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760166 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760170 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760174 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760178 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760182 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760186 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760190 4842 feature_gate.go:330] unrecognized feature gate: Example Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760194 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760198 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760204 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760211 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 11 18:49:14 crc 
kubenswrapper[4842]: W0311 18:49:14.760217 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760222 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760227 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760232 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760237 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760242 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760247 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760252 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760257 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760261 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760266 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760276 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760302 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.760308 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.760317 4842 feature_gate.go:386] 
feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.761293 4842 server.go:940] "Client rotation is on, will bootstrap in background" Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.772109 4842 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.776997 4842 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.777102 4842 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.778661 4842 server.go:997] "Starting client certificate rotation" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.778695 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.778872 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.801204 4842 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.804618 4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.805322 4842 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.815043 4842 log.go:25] "Validated CRI v1 runtime API" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.850582 4842 log.go:25] "Validated CRI v1 image API" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.852535 4842 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.856639 4842 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-11-18-45-01-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.856681 4842 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.875029 4842 manager.go:217] Machine: {Timestamp:2026-03-11 18:49:14.871848545 +0000 UTC m=+0.519544865 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4a5eeb05-4676-462d-b71e-ee04d871eea1 BootID:16dedd3d-ff19-42b0-bfef-c82bb1fa68db Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:5b:63:0f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5b:63:0f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e5:9b:15 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d8:15:a3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6b:d0:23 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b3:48:1d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:82:11:71:21:dc Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:58:26:d2:ab:b6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.875304 4842 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.875430 4842 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.879053 4842 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.879678 4842 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.879746 4842 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.880015 4842 topology_manager.go:138] "Creating topology manager with none policy" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.880036 4842 container_manager_linux.go:303] "Creating device plugin manager" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.880430 4842 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.881027 4842 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.881263 4842 state_mem.go:36] "Initialized new in-memory state store" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.881416 4842 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.884599 4842 kubelet.go:418] "Attempting to sync node with API server" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.884658 4842 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.884706 4842 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.884727 4842 kubelet.go:324] "Adding apiserver pod source" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.884745 4842 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 
18:49:14.888895 4842 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.889810 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.889881 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.889814 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.889937 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.890299 4842 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.894869 4842 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896166 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896192 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896203 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896214 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896230 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896239 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896248 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896263 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896293 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896303 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896324 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.896334 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.897231 4842 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.897698 4842 server.go:1280] "Started kubelet" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.898782 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:14 crc systemd[1]: Started Kubernetes Kubelet. Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.899180 4842 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.899940 4842 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.899980 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.900007 4842 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.900232 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.900298 4842 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.900305 4842 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.900396 4842 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.899214 4842 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.901798 4842 factory.go:55] Registering systemd factory Mar 11 18:49:14 crc 
kubenswrapper[4842]: I0311 18:49:14.901824 4842 factory.go:221] Registration of the systemd container factory successfully Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.902158 4842 factory.go:153] Registering CRI-O factory Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.902203 4842 factory.go:221] Registration of the crio container factory successfully Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.902167 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.902374 4842 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.902442 4842 factory.go:103] Registering Raw factory Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.902421 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.902490 4842 manager.go:1196] Started watching for new ooms in manager Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.903050 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="200ms" Mar 11 18:49:14 crc 
kubenswrapper[4842]: I0311 18:49:14.904582 4842 manager.go:319] Starting recovery of all containers Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.905657 4842 server.go:460] "Adding debug handlers to kubelet server" Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.911126 4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.251:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189bddfc82e06cfc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,LastTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918207 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918301 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918324 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918342 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918358 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918376 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918393 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918408 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918426 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" 
seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918439 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918455 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918468 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918485 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918504 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918521 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918537 4842 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918552 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918567 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918582 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918595 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918608 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918622 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918642 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918658 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918673 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918688 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918707 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918722 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918737 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918753 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918769 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918787 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918802 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918816 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918831 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918845 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918862 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918877 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918908 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918924 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918941 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918957 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918973 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.918999 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919013 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919028 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919046 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919062 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919078 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919093 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919108 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919122 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919142 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919163 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919181 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919198 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919214 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919230 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919244 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919257 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919275 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919317 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919333 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919350 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919365 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919382 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919396 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919414 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919429 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919443 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919458 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919476 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919493 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919512 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919531 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919553 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919570 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919613 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.919629 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.922522 4842 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.922599 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.922627 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923385 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923427 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923442 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923460 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923475 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923487 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923500 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923513 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923527 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923543 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923556 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923570 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" 
seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923584 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923597 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923611 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923623 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923638 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923653 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923668 4842 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923680 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923694 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923707 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923722 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923751 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923774 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923789 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923802 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923815 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923829 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923843 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923857 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923872 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923886 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923900 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923912 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923924 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923937 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" 
seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923950 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923963 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923976 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.923989 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924004 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924016 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924034 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924047 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924058 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924072 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924083 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924096 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924108 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924119 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924132 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924151 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924162 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924175 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924187 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924199 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924209 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924221 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924231 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924244 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924255 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924269 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924303 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924314 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924327 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924340 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924357 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924369 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924381 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924393 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924405 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924417 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924431 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924444 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924461 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924474 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924486 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924501 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924518 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924533 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924547 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924562 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924578 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924593 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924608 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924622 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924640 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924685 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924701 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924718 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924736 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924753 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924769 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924784 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924799 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924811 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924824 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924837 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924853 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924865 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924879 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924892 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924904 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924918 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924931 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924946 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924958 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924971 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924984 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.924996 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925009 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925054 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925069 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925085 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925100 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925113 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925127 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925140 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925154 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925167 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925182 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925193 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925207 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925219 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925232 4842 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925244 4842 reconstruct.go:97] "Volume reconstruction finished"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.925254 4842 reconciler.go:26] "Reconciler: start to sync state"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.937072 4842 manager.go:324] Recovery completed
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.948189 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.949921 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.949985 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.949998 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.951170 4842 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.951195 4842 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.951238 4842 state_mem.go:36] "Initialized new in-memory state store"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.959218 4842 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.960830 4842 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.960869 4842 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.960894 4842 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.960992 4842 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 11 18:49:14 crc kubenswrapper[4842]: W0311 18:49:14.962483 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused
Mar 11 18:49:14 crc kubenswrapper[4842]: E0311 18:49:14.962540 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.972140 4842 policy_none.go:49] "None policy: Start"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.972993 4842 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 11 18:49:14 crc kubenswrapper[4842]: I0311 18:49:14.973028 4842 state_mem.go:35] "Initializing new in-memory state store"
Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.001618 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.021237 4842 manager.go:334] "Starting Device Plugin manager"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.021297 4842 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.021310 4842 server.go:79] "Starting device plugin registration server"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.021749 4842 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.021763 4842 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.022029 4842 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.022096 4842 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.022103 4842 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.031145 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.061853 4842 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.061986 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063044 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063075 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063085 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063225 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063524 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063577 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063943 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063962 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.063970 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064064 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064225 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064250 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064773 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064814 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064825 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064783 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064898 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.064909 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.065195 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.065232 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.065257 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.065306 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.065787 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.065819 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.068776 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.068875 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.068930 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.069111 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.069176 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.069233 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.069418 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.069563 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.069634 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.070598 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.070598 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.070802 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.070880 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.070705 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.071012 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.071119 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.071189 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.072010 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.072054 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.072068 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.104540 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="400ms" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.122586 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.123862 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.123917 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.123930 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.123959 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.124481 4842 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.126883 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.126907 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.126930 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.126946 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.126968 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.126987 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127026 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127068 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127109 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127156 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127172 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127205 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127235 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127265 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.127312 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 
18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.228967 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229400 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229431 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229454 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229482 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229505 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229533 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229556 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229581 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229605 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229628 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229652 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229677 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229699 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229734 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.229152 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230185 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230218 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230301 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230311 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230343 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230363 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230384 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230390 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230423 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230424 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230427 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230453 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230454 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.230455 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.324856 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.326736 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.326796 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.326820 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.326861 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.327553 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.251:6443: connect: connection refused" node="crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.397362 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.421931 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.428129 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: W0311 18:49:15.442010 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d05d7c644ef09a2f45f64ee17c7f9b6d5da88a3053874cc5dc20fbc9fcbcbe2a WatchSource:0}: Error finding container d05d7c644ef09a2f45f64ee17c7f9b6d5da88a3053874cc5dc20fbc9fcbcbe2a: Status 404 returned error can't find the container with id d05d7c644ef09a2f45f64ee17c7f9b6d5da88a3053874cc5dc20fbc9fcbcbe2a Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.451808 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.455352 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:15 crc kubenswrapper[4842]: W0311 18:49:15.460313 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-88cbd46557050f20c5d9d813844ef01358516c24193e96de9013656c05e575f0 WatchSource:0}: Error finding container 88cbd46557050f20c5d9d813844ef01358516c24193e96de9013656c05e575f0: Status 404 returned error can't find the container with id 88cbd46557050f20c5d9d813844ef01358516c24193e96de9013656c05e575f0 Mar 11 18:49:15 crc kubenswrapper[4842]: W0311 18:49:15.463238 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cbe9fe886b7fc52a4d9241ec5eda19fcaaac9f6e1356733270eab204c0384a4e WatchSource:0}: Error finding container cbe9fe886b7fc52a4d9241ec5eda19fcaaac9f6e1356733270eab204c0384a4e: Status 404 returned error can't find the container with id cbe9fe886b7fc52a4d9241ec5eda19fcaaac9f6e1356733270eab204c0384a4e Mar 11 18:49:15 crc kubenswrapper[4842]: W0311 18:49:15.470338 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-20929571a21e469d86f2b1a28f49448314c33f1950a94dd7084ba22f642e6502 WatchSource:0}: Error finding container 20929571a21e469d86f2b1a28f49448314c33f1950a94dd7084ba22f642e6502: Status 404 returned error can't find the container with id 20929571a21e469d86f2b1a28f49448314c33f1950a94dd7084ba22f642e6502 Mar 11 18:49:15 crc kubenswrapper[4842]: W0311 18:49:15.476378 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ff0ff9131b1b7f51ec99acbf0ba9721bb21b1ffaeb92bbbeccb848a87066aebf 
WatchSource:0}: Error finding container ff0ff9131b1b7f51ec99acbf0ba9721bb21b1ffaeb92bbbeccb848a87066aebf: Status 404 returned error can't find the container with id ff0ff9131b1b7f51ec99acbf0ba9721bb21b1ffaeb92bbbeccb848a87066aebf Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.505453 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="800ms" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.728506 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.729580 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.729627 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.729639 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.729671 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.730144 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.900502 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 
18:49:15.965754 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"88cbd46557050f20c5d9d813844ef01358516c24193e96de9013656c05e575f0"} Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.966886 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d05d7c644ef09a2f45f64ee17c7f9b6d5da88a3053874cc5dc20fbc9fcbcbe2a"} Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.968091 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ff0ff9131b1b7f51ec99acbf0ba9721bb21b1ffaeb92bbbeccb848a87066aebf"} Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.969426 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"20929571a21e469d86f2b1a28f49448314c33f1950a94dd7084ba22f642e6502"} Mar 11 18:49:15 crc kubenswrapper[4842]: I0311 18:49:15.970795 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cbe9fe886b7fc52a4d9241ec5eda19fcaaac9f6e1356733270eab204c0384a4e"} Mar 11 18:49:15 crc kubenswrapper[4842]: W0311 18:49:15.976567 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:15 crc kubenswrapper[4842]: E0311 18:49:15.976649 4842 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:16 crc kubenswrapper[4842]: W0311 18:49:16.130400 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:16 crc kubenswrapper[4842]: E0311 18:49:16.130934 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:16 crc kubenswrapper[4842]: W0311 18:49:16.296488 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:16 crc kubenswrapper[4842]: E0311 18:49:16.296609 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:16 crc kubenswrapper[4842]: E0311 18:49:16.306666 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="1.6s" Mar 11 18:49:16 crc kubenswrapper[4842]: W0311 18:49:16.371215 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:16 crc kubenswrapper[4842]: E0311 18:49:16.371343 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.530637 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.533027 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.533081 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.533095 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.533135 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 18:49:16 crc kubenswrapper[4842]: E0311 18:49:16.533791 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Mar 11 
18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.899534 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.965719 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 18:49:16 crc kubenswrapper[4842]: E0311 18:49:16.966683 4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.977522 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4ac00bdbaf4507366bedd6e12748fa4e54c702e1cc9de8e0f712b55b486421f"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.977627 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.977713 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d26431d830e73b55e120456a559859496fc84ac10406cf27322cb1a98e8b7b56"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.977729 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7edae5b9b939e5967612c927811c66aa016d1528211dbec811920a99b1037acb"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.977740 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.978675 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.978710 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.978723 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.979529 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be" exitCode=0 Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.979598 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.979607 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.980191 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.980216 4842 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.980225 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.981365 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.982212 4842 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c" exitCode=0 Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.982238 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.982328 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.983071 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.983097 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.983108 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.983095 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.983227 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.983261 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.984924 4842 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740" exitCode=0 Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.984964 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.985046 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.986140 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.986161 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.986172 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.988244 4842 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489" exitCode=0 Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.988306 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489"} Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.988360 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.990155 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.990188 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:16 crc kubenswrapper[4842]: I0311 18:49:16.990204 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.900487 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:17 crc kubenswrapper[4842]: E0311 18:49:17.907880 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="3.2s" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.994026 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a80fb37278499c15aa38bdbb1aff780e73d9341d802829c7b3e02afffa5f1498"} Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.994082 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541"} Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.994120 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a"} Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.994131 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301"} Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.994141 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04"} Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.994189 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.995480 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.995548 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.995564 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.996842 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a"} Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.996900 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.997941 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.997988 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.998010 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.999073 4842 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f" exitCode=0 Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.999207 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:17 crc kubenswrapper[4842]: I0311 18:49:17.999218 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f"} Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.000423 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.000496 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.000518 4842 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.001751 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.001747 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2"} Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.001797 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59"} Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.001828 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c"} Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.001956 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.002402 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.002430 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.002440 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.003179 4842 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.003198 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.003206 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.134386 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.136200 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.136243 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.136253 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.136296 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 18:49:18 crc kubenswrapper[4842]: E0311 18:49:18.136959 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.251:6443: connect: connection refused" node="crc" Mar 11 18:49:18 crc kubenswrapper[4842]: W0311 18:49:18.162967 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:18 crc kubenswrapper[4842]: E0311 18:49:18.163123 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:18 crc kubenswrapper[4842]: W0311 18:49:18.186906 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.251:6443: connect: connection refused Mar 11 18:49:18 crc kubenswrapper[4842]: E0311 18:49:18.187031 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.251:6443: connect: connection refused" logger="UnhandledError" Mar 11 18:49:18 crc kubenswrapper[4842]: I0311 18:49:18.813669 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.007225 4842 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326" exitCode=0 Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.007348 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326"} Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.007433 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.007603 4842 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.007677 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.008503 4842 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.008562 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.008848 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009543 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009578 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009591 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009612 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009653 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009675 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009714 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009743 4842 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.009761 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.010736 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.010899 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:19 crc kubenswrapper[4842]: I0311 18:49:19.011018 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.017996 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3"} Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.018073 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3"} Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.018090 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128"} Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.018105 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483"} Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.018169 4842 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.018324 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.019710 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.019776 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.019802 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.290516 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.290894 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.293905 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.293961 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.293981 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:20 crc kubenswrapper[4842]: I0311 18:49:20.993970 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.026196 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11"} Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.026493 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.027755 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.027815 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.027832 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.338106 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.339505 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.339567 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.339599 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.339663 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.697693 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.964700 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.964921 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.966036 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.966076 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:21 crc kubenswrapper[4842]: I0311 18:49:21.966085 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.027979 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.028685 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.028718 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.028730 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.058399 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.058511 4842 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.058538 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.059478 4842 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.059518 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.059527 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.785230 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.785440 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.787049 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.787091 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.787100 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.793130 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:22 crc kubenswrapper[4842]: I0311 18:49:22.830975 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.030263 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.030298 4842 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.030326 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.031207 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.031215 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.031238 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.031242 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.031248 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:23 crc kubenswrapper[4842]: I0311 18:49:23.031253 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.032079 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.033098 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.033133 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.033147 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.654011 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.654424 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.656729 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.656806 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.656827 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.965303 4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 18:49:24 crc kubenswrapper[4842]: I0311 18:49:24.965398 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 18:49:25 crc kubenswrapper[4842]: E0311 18:49:25.031342 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 18:49:25 crc kubenswrapper[4842]: I0311 18:49:25.234998 4842 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:25 crc kubenswrapper[4842]: I0311 18:49:25.235365 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:25 crc kubenswrapper[4842]: I0311 18:49:25.237386 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:25 crc kubenswrapper[4842]: I0311 18:49:25.237477 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:25 crc kubenswrapper[4842]: I0311 18:49:25.237504 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:26 crc kubenswrapper[4842]: I0311 18:49:26.738679 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:49:26 crc kubenswrapper[4842]: I0311 18:49:26.738845 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:26 crc kubenswrapper[4842]: I0311 18:49:26.740304 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:26 crc kubenswrapper[4842]: I0311 18:49:26.740376 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:26 crc kubenswrapper[4842]: I0311 18:49:26.740394 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:28 crc kubenswrapper[4842]: I0311 18:49:28.900553 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 11 18:49:29 crc kubenswrapper[4842]: W0311 18:49:29.111559 4842 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.111918 4842 trace.go:236] Trace[1197832777]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Mar-2026 18:49:19.109) (total time: 10002ms): Mar 11 18:49:29 crc kubenswrapper[4842]: Trace[1197832777]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:49:29.111) Mar 11 18:49:29 crc kubenswrapper[4842]: Trace[1197832777]: [10.002296151s] [10.002296151s] END Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.111938 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 11 18:49:29 crc kubenswrapper[4842]: W0311 18:49:29.121545 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.121653 4842 trace.go:236] Trace[417237858]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Mar-2026 18:49:19.119) (total time: 10001ms): Mar 11 18:49:29 crc kubenswrapper[4842]: Trace[417237858]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:49:29.121) Mar 11 18:49:29 crc kubenswrapper[4842]: Trace[417237858]: [10.001964376s] 
[10.001964376s] END Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.121678 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 11 18:49:29 crc kubenswrapper[4842]: W0311 18:49:29.476402 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.476502 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.479206 4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.479411 4842 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.479447 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.479592 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 11 18:49:29 crc kubenswrapper[4842]: W0311 18:49:29.480566 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.480633 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.481135 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 18:49:29 crc kubenswrapper[4842]: E0311 18:49:29.483980 4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bddfc82e06cfc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,LastTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.487298 4842 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 11 18:49:29 
crc kubenswrapper[4842]: I0311 18:49:29.487341 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.902651 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:29Z is after 2026-02-23T05:33:13Z Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.911347 4842 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 11 18:49:29 crc kubenswrapper[4842]: I0311 18:49:29.911409 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.049478 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.051421 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a80fb37278499c15aa38bdbb1aff780e73d9341d802829c7b3e02afffa5f1498" exitCode=255 Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 
18:49:30.051469 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a80fb37278499c15aa38bdbb1aff780e73d9341d802829c7b3e02afffa5f1498"} Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.051638 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.052989 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.053025 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.053035 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.053590 4842 scope.go:117] "RemoveContainer" containerID="a80fb37278499c15aa38bdbb1aff780e73d9341d802829c7b3e02afffa5f1498" Mar 11 18:49:30 crc kubenswrapper[4842]: I0311 18:49:30.910597 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:30Z is after 2026-02-23T05:33:13Z Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.057021 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.057542 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.059449 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605" exitCode=255 Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.059520 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605"} Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.059600 4842 scope.go:117] "RemoveContainer" containerID="a80fb37278499c15aa38bdbb1aff780e73d9341d802829c7b3e02afffa5f1498" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.059873 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.061000 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.061032 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.061045 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.061616 4842 scope.go:117] "RemoveContainer" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605" Mar 11 18:49:31 crc kubenswrapper[4842]: E0311 18:49:31.061820 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 18:49:31 crc kubenswrapper[4842]: I0311 18:49:31.906634 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:31Z is after 2026-02-23T05:33:13Z Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.064565 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.072190 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.072445 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.074188 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.074316 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.074343 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.075418 4842 scope.go:117] "RemoveContainer" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605" Mar 11 18:49:32 crc kubenswrapper[4842]: E0311 18:49:32.075838 4842 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.082652 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:32 crc kubenswrapper[4842]: I0311 18:49:32.902966 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:32Z is after 2026-02-23T05:33:13Z Mar 11 18:49:33 crc kubenswrapper[4842]: I0311 18:49:33.069006 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:33 crc kubenswrapper[4842]: I0311 18:49:33.071078 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:33 crc kubenswrapper[4842]: I0311 18:49:33.071135 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:33 crc kubenswrapper[4842]: I0311 18:49:33.071151 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:33 crc kubenswrapper[4842]: I0311 18:49:33.071919 4842 scope.go:117] "RemoveContainer" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605" Mar 11 18:49:33 crc kubenswrapper[4842]: E0311 18:49:33.072129 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 18:49:33 crc kubenswrapper[4842]: W0311 18:49:33.499722 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:33Z is after 2026-02-23T05:33:13Z Mar 11 18:49:33 crc kubenswrapper[4842]: E0311 18:49:33.499786 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 18:49:33 crc kubenswrapper[4842]: I0311 18:49:33.905853 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:33Z is after 2026-02-23T05:33:13Z Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.690857 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.691132 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 
18:49:34.692941 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.693068 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.693106 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.709599 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.902413 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:34Z is after 2026-02-23T05:33:13Z Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.965652 4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 18:49:34 crc kubenswrapper[4842]: I0311 18:49:34.965774 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 18:49:35 crc kubenswrapper[4842]: E0311 18:49:35.031558 4842 eviction_manager.go:285] 
"Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.076363 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.077554 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.077597 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.077604 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.235455 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.235687 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.237501 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.237620 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.237657 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.238594 4842 scope.go:117] "RemoveContainer" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605" Mar 11 18:49:35 crc kubenswrapper[4842]: E0311 18:49:35.238895 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:49:35 crc kubenswrapper[4842]: W0311 18:49:35.321409    4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:35Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:35 crc kubenswrapper[4842]: E0311 18:49:35.321486    4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.881791    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:35 crc kubenswrapper[4842]: E0311 18:49:35.883022    4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:35Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.883516    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.883612    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.883625    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.883647    4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:49:35 crc kubenswrapper[4842]: E0311 18:49:35.887122    4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:35Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 11 18:49:35 crc kubenswrapper[4842]: I0311 18:49:35.903610    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:35Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:36 crc kubenswrapper[4842]: I0311 18:49:36.905193    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:36Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:36 crc kubenswrapper[4842]: W0311 18:49:36.964051    4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:36Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:36 crc kubenswrapper[4842]: E0311 18:49:36.964120    4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:37 crc kubenswrapper[4842]: I0311 18:49:37.902142    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:37Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:37 crc kubenswrapper[4842]: I0311 18:49:37.949097    4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 11 18:49:37 crc kubenswrapper[4842]: E0311 18:49:37.952934    4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:37 crc kubenswrapper[4842]: W0311 18:49:37.999805    4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:37Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:37 crc kubenswrapper[4842]: E0311 18:49:37.999883    4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:38 crc kubenswrapper[4842]: I0311 18:49:38.903247    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:38Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:39 crc kubenswrapper[4842]: E0311 18:49:39.487874    4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bddfc82e06cfc  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,LastTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:49:39 crc kubenswrapper[4842]: I0311 18:49:39.902438    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:39Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:39 crc kubenswrapper[4842]: I0311 18:49:39.910498    4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:49:39 crc kubenswrapper[4842]: I0311 18:49:39.910667    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:39 crc kubenswrapper[4842]: I0311 18:49:39.911580    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:39 crc kubenswrapper[4842]: I0311 18:49:39.911622    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:39 crc kubenswrapper[4842]: I0311 18:49:39.911633    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:39 crc kubenswrapper[4842]: I0311 18:49:39.912869    4842 scope.go:117] "RemoveContainer" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605"
Mar 11 18:49:39 crc kubenswrapper[4842]: E0311 18:49:39.913165    4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:49:40 crc kubenswrapper[4842]: W0311 18:49:40.196155    4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:40Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:40 crc kubenswrapper[4842]: E0311 18:49:40.196291    4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:40 crc kubenswrapper[4842]: I0311 18:49:40.903463    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:40Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:41 crc kubenswrapper[4842]: I0311 18:49:41.902211    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:41Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:42 crc kubenswrapper[4842]: W0311 18:49:42.506341    4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:42Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:42 crc kubenswrapper[4842]: E0311 18:49:42.506422    4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:42 crc kubenswrapper[4842]: I0311 18:49:42.887599    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:42 crc kubenswrapper[4842]: E0311 18:49:42.887702    4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:42Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 11 18:49:42 crc kubenswrapper[4842]: I0311 18:49:42.888714    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:42 crc kubenswrapper[4842]: I0311 18:49:42.888744    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:42 crc kubenswrapper[4842]: I0311 18:49:42.888753    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:42 crc kubenswrapper[4842]: I0311 18:49:42.888772    4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:49:42 crc kubenswrapper[4842]: E0311 18:49:42.893310    4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:42Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 11 18:49:42 crc kubenswrapper[4842]: I0311 18:49:42.907247    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:42Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:43 crc kubenswrapper[4842]: I0311 18:49:43.902745    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:43Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.905513    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:44Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.965410    4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.965523    4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.965618    4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.965875    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.967821    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.967874    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.967894    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.968808    4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"7edae5b9b939e5967612c927811c66aa016d1528211dbec811920a99b1037acb"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 11 18:49:44 crc kubenswrapper[4842]: I0311 18:49:44.969079    4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://7edae5b9b939e5967612c927811c66aa016d1528211dbec811920a99b1037acb" gracePeriod=30
Mar 11 18:49:45 crc kubenswrapper[4842]: E0311 18:49:45.031770    4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 18:49:45 crc kubenswrapper[4842]: I0311 18:49:45.111965    4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 11 18:49:45 crc kubenswrapper[4842]: I0311 18:49:45.112491    4842 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7edae5b9b939e5967612c927811c66aa016d1528211dbec811920a99b1037acb" exitCode=255
Mar 11 18:49:45 crc kubenswrapper[4842]: I0311 18:49:45.112540    4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7edae5b9b939e5967612c927811c66aa016d1528211dbec811920a99b1037acb"}
Mar 11 18:49:45 crc kubenswrapper[4842]: I0311 18:49:45.902616    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:45Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:46 crc kubenswrapper[4842]: I0311 18:49:46.119449    4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 11 18:49:46 crc kubenswrapper[4842]: I0311 18:49:46.119788    4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049"}
Mar 11 18:49:46 crc kubenswrapper[4842]: I0311 18:49:46.119934    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:46 crc kubenswrapper[4842]: I0311 18:49:46.121064    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:46 crc kubenswrapper[4842]: I0311 18:49:46.121120    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:46 crc kubenswrapper[4842]: I0311 18:49:46.121142    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:46 crc kubenswrapper[4842]: I0311 18:49:46.902921    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:46Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:47 crc kubenswrapper[4842]: I0311 18:49:47.122514    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:47 crc kubenswrapper[4842]: I0311 18:49:47.123422    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:47 crc kubenswrapper[4842]: I0311 18:49:47.123487    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:47 crc kubenswrapper[4842]: I0311 18:49:47.123511    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:47 crc kubenswrapper[4842]: I0311 18:49:47.905141    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:47Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:48 crc kubenswrapper[4842]: I0311 18:49:48.905779    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:48Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:49 crc kubenswrapper[4842]: E0311 18:49:49.493171    4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:49Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bddfc82e06cfc  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,LastTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:49:49 crc kubenswrapper[4842]: E0311 18:49:49.890999    4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:49Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 11 18:49:49 crc kubenswrapper[4842]: I0311 18:49:49.894119    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:49 crc kubenswrapper[4842]: I0311 18:49:49.895187    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:49 crc kubenswrapper[4842]: I0311 18:49:49.895224    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:49 crc kubenswrapper[4842]: I0311 18:49:49.895232    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:49 crc kubenswrapper[4842]: I0311 18:49:49.895253    4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:49:49 crc kubenswrapper[4842]: E0311 18:49:49.898428    4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:49Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 11 18:49:49 crc kubenswrapper[4842]: I0311 18:49:49.901943    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:49Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:50 crc kubenswrapper[4842]: I0311 18:49:50.902818    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:50Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.905830    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:51Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.962210    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.964476    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.964563    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.964589    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.965060    4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.965324    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.965858    4842 scope.go:117] "RemoveContainer" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.966838    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.966925    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:51 crc kubenswrapper[4842]: I0311 18:49:51.966947    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:52 crc kubenswrapper[4842]: I0311 18:49:52.832011    4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:49:52 crc kubenswrapper[4842]: I0311 18:49:52.832152    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:52 crc kubenswrapper[4842]: I0311 18:49:52.833111    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:52 crc kubenswrapper[4842]: I0311 18:49:52.833194    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:52 crc kubenswrapper[4842]: I0311 18:49:52.833211    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:52 crc kubenswrapper[4842]: I0311 18:49:52.902514    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:52Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.140636    4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.142341    4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.146466    4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0656ddd427d0fd89eca467ecd52f2add48519de7dd5601d0df316b647e653110" exitCode=255
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.146554    4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0656ddd427d0fd89eca467ecd52f2add48519de7dd5601d0df316b647e653110"}
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.146605    4842 scope.go:117] "RemoveContainer" containerID="9969ba8f42ac0d51262a2a76d4bf9f87eff2e70d4af9b76d02eb297c03b5c605"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.146962    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.149633    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.149667    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.149680    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.150173    4842 scope.go:117] "RemoveContainer" containerID="0656ddd427d0fd89eca467ecd52f2add48519de7dd5601d0df316b647e653110"
Mar 11 18:49:53 crc kubenswrapper[4842]: E0311 18:49:53.150382    4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:49:53 crc kubenswrapper[4842]: W0311 18:49:53.335300    4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:53Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:53 crc kubenswrapper[4842]: E0311 18:49:53.335442    4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:53 crc kubenswrapper[4842]: I0311 18:49:53.902787    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:53Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:54 crc kubenswrapper[4842]: I0311 18:49:54.152024    4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 11 18:49:54 crc kubenswrapper[4842]: I0311 18:49:54.711260    4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 11 18:49:54 crc kubenswrapper[4842]: E0311 18:49:54.716012    4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:54 crc kubenswrapper[4842]: E0311 18:49:54.717250    4842 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 11 18:49:54 crc kubenswrapper[4842]: I0311 18:49:54.904394    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:54Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:54 crc kubenswrapper[4842]: I0311 18:49:54.965231    4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 11 18:49:54 crc kubenswrapper[4842]: I0311 18:49:54.965414    4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 11 18:49:55 crc kubenswrapper[4842]: E0311 18:49:55.032889    4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 18:49:55 crc kubenswrapper[4842]: I0311 18:49:55.235654    4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:49:55 crc kubenswrapper[4842]: I0311 18:49:55.235974    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:55 crc kubenswrapper[4842]: I0311 18:49:55.237696    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:55 crc kubenswrapper[4842]: I0311 18:49:55.237789    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:55 crc kubenswrapper[4842]: I0311 18:49:55.237840    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:55 crc kubenswrapper[4842]: I0311 18:49:55.238815    4842 scope.go:117] "RemoveContainer" containerID="0656ddd427d0fd89eca467ecd52f2add48519de7dd5601d0df316b647e653110"
Mar 11 18:49:55 crc kubenswrapper[4842]: E0311 18:49:55.239134    4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:49:55 crc kubenswrapper[4842]: I0311 18:49:55.902710    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:55Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:56 crc kubenswrapper[4842]: W0311 18:49:56.762500    4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:56Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:56 crc kubenswrapper[4842]: E0311 18:49:56.762597    4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 11 18:49:56 crc kubenswrapper[4842]: E0311 18:49:56.897555    4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:56Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 11 18:49:56 crc kubenswrapper[4842]: I0311 18:49:56.899196    4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:49:56 crc kubenswrapper[4842]: I0311 18:49:56.901135    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:49:56 crc kubenswrapper[4842]: I0311 18:49:56.901186    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:49:56 crc kubenswrapper[4842]: I0311 18:49:56.901203    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:49:56 crc kubenswrapper[4842]: I0311 18:49:56.901235    4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:49:56 crc kubenswrapper[4842]: I0311 18:49:56.902913    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:56Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:56 crc kubenswrapper[4842]: E0311 18:49:56.905151    4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:56Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 11 18:49:57 crc kubenswrapper[4842]: I0311 18:49:57.902062    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:57Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:58 crc kubenswrapper[4842]: I0311 18:49:58.904469    4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:58Z is after 2026-02-23T05:33:13Z
Mar 11 18:49:59 crc kubenswrapper[4842]: E0311 18:49:59.496758    4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:59Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bddfc82e06cfc  default    0 0001-01-01 00:00:00 +0000 UTC map[] map[] []
[] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,LastTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:49:59 crc kubenswrapper[4842]: I0311 18:49:59.904100 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:49:59Z is after 2026-02-23T05:33:13Z Mar 11 18:49:59 crc kubenswrapper[4842]: I0311 18:49:59.911414 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:49:59 crc kubenswrapper[4842]: I0311 18:49:59.911644 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:49:59 crc kubenswrapper[4842]: I0311 18:49:59.913230 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:49:59 crc kubenswrapper[4842]: I0311 18:49:59.913336 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:49:59 crc kubenswrapper[4842]: I0311 18:49:59.913359 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:49:59 crc kubenswrapper[4842]: I0311 18:49:59.914611 4842 scope.go:117] "RemoveContainer" containerID="0656ddd427d0fd89eca467ecd52f2add48519de7dd5601d0df316b647e653110" Mar 11 18:49:59 crc kubenswrapper[4842]: E0311 
18:49:59.914887 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 18:50:00 crc kubenswrapper[4842]: I0311 18:50:00.903008 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:00Z is after 2026-02-23T05:33:13Z Mar 11 18:50:01 crc kubenswrapper[4842]: W0311 18:50:01.629092 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:01Z is after 2026-02-23T05:33:13Z Mar 11 18:50:01 crc kubenswrapper[4842]: E0311 18:50:01.629231 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 18:50:01 crc kubenswrapper[4842]: I0311 18:50:01.902700 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:01Z is after 2026-02-23T05:33:13Z Mar 11 18:50:02 crc kubenswrapper[4842]: I0311 18:50:02.902392 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:02Z is after 2026-02-23T05:33:13Z Mar 11 18:50:03 crc kubenswrapper[4842]: I0311 18:50:03.905778 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:50:03 crc kubenswrapper[4842]: I0311 18:50:03.906808 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 18:50:03 crc kubenswrapper[4842]: E0311 18:50:03.906896 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 18:50:03 crc kubenswrapper[4842]: I0311 18:50:03.907514 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:03 crc kubenswrapper[4842]: I0311 18:50:03.907553 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:03 crc kubenswrapper[4842]: I0311 18:50:03.907566 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:03 crc kubenswrapper[4842]: I0311 18:50:03.907597 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 18:50:03 crc 
kubenswrapper[4842]: E0311 18:50:03.912791 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 18:50:04 crc kubenswrapper[4842]: I0311 18:50:04.908191 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 18:50:04 crc kubenswrapper[4842]: I0311 18:50:04.966031 4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 18:50:04 crc kubenswrapper[4842]: I0311 18:50:04.966172 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 18:50:05 crc kubenswrapper[4842]: E0311 18:50:05.033419 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 18:50:05 crc kubenswrapper[4842]: I0311 18:50:05.909710 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 18:50:06 crc kubenswrapper[4842]: I0311 18:50:06.904441 4842 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 18:50:06 crc kubenswrapper[4842]: W0311 18:50:06.904651 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 11 18:50:06 crc kubenswrapper[4842]: E0311 18:50:06.904740 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 11 18:50:07 crc kubenswrapper[4842]: I0311 18:50:07.907628 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 18:50:08 crc kubenswrapper[4842]: I0311 18:50:08.907490 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.501954 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc82e06cfc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,LastTimestamp:2026-03-11 18:49:14.897665276 +0000 UTC m=+0.545361576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.506569 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,LastTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.511045 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.516192 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85ff0b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,LastTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.522570 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc8a736114 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:15.024736532 +0000 UTC m=+0.672432812,LastTimestamp:2026-03-11 18:49:15.024736532 +0000 UTC 
m=+0.672432812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.526883 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fe4d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,LastTimestamp:2026-03-11 18:49:15.063068135 +0000 UTC m=+0.710764405,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.533683 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fee61a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:15.063080946 +0000 UTC m=+0.710777226,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.540513 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85ff0b90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85ff0b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,LastTimestamp:2026-03-11 18:49:15.063089526 +0000 UTC m=+0.710785806,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.547732 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fe4d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,LastTimestamp:2026-03-11 18:49:15.063956322 +0000 UTC m=+0.711652602,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.552886 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fee61a\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:15.063967082 +0000 UTC m=+0.711663362,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.559078 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85ff0b90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85ff0b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,LastTimestamp:2026-03-11 18:49:15.063976352 +0000 UTC m=+0.711672622,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.564208 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fe4d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,LastTimestamp:2026-03-11 18:49:15.064804598 +0000 UTC m=+0.712500878,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.569425 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fee61a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:15.064821228 +0000 UTC m=+0.712517508,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.574948 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85ff0b90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85ff0b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,LastTimestamp:2026-03-11 18:49:15.064833748 +0000 UTC m=+0.712530028,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.578836 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fe4d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,LastTimestamp:2026-03-11 18:49:15.064889629 +0000 UTC m=+0.712585899,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.585017 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fee61a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status 
is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:15.06490523 +0000 UTC m=+0.712601510,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.590390 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85ff0b90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85ff0b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,LastTimestamp:2026-03-11 18:49:15.06491373 +0000 UTC m=+0.712610010,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.595873 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fe4d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC 
m=+0.597651229,LastTimestamp:2026-03-11 18:49:15.065213355 +0000 UTC m=+0.712909635,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.598698 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fee61a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:15.065237236 +0000 UTC m=+0.712933516,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.602412 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85ff0b90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85ff0b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,LastTimestamp:2026-03-11 18:49:15.065261256 +0000 UTC m=+0.712957536,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.605422 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fe4d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,LastTimestamp:2026-03-11 18:49:15.068868233 +0000 UTC m=+0.716564513,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.611714 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fee61a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:15.068925675 +0000 UTC m=+0.716621955,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.618154 4842 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85ff0b90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85ff0b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.9500036 +0000 UTC m=+0.597699870,LastTimestamp:2026-03-11 18:49:15.068987146 +0000 UTC m=+0.716683426,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.625066 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fe4d85\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fe4d85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.949954949 +0000 UTC m=+0.597651229,LastTimestamp:2026-03-11 18:49:15.069171439 +0000 UTC m=+0.716867719,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.630739 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bddfc85fee61a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bddfc85fee61a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:14.94999401 +0000 UTC m=+0.597690290,LastTimestamp:2026-03-11 18:49:15.06922903 +0000 UTC m=+0.716925310,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.637531 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfca3f5ad70 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:15.45270616 +0000 UTC m=+1.100402450,LastTimestamp:2026-03-11 18:49:15.45270616 +0000 UTC m=+1.100402450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.645657 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bddfca4ebb2a6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:15.46882935 +0000 UTC m=+1.116525650,LastTimestamp:2026-03-11 18:49:15.46882935 +0000 UTC m=+1.116525650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.650420 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfca4ebb292 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:15.46882933 +0000 UTC m=+1.116525630,LastTimestamp:2026-03-11 18:49:15.46882933 +0000 UTC m=+1.116525630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.654871 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfca528ae27 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:15.472825895 +0000 UTC m=+1.120522165,LastTimestamp:2026-03-11 18:49:15.472825895 +0000 UTC m=+1.120522165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.661967 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfca5ed995d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:15.485731165 +0000 UTC m=+1.133427445,LastTimestamp:2026-03-11 18:49:15.485731165 +0000 UTC m=+1.133427445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.666546 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcc8a54a22 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.06819485 +0000 UTC m=+1.715891170,LastTimestamp:2026-03-11 18:49:16.06819485 +0000 UTC m=+1.715891170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.671200 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfcc8a68f7d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.068278141 +0000 UTC m=+1.715974421,LastTimestamp:2026-03-11 18:49:16.068278141 +0000 UTC m=+1.715974421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.676666 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfcc8a7b4db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.068353243 +0000 UTC m=+1.716049523,LastTimestamp:2026-03-11 18:49:16.068353243 +0000 UTC m=+1.716049523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.684252 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfcc8aa479e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.068521886 +0000 UTC m=+1.716218166,LastTimestamp:2026-03-11 18:49:16.068521886 +0000 UTC m=+1.716218166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.690934 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bddfcc8aa877c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.068538236 +0000 UTC m=+1.716234556,LastTimestamp:2026-03-11 18:49:16.068538236 +0000 UTC m=+1.716234556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.696310 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfcc97f854e 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.082496846 +0000 UTC m=+1.730193126,LastTimestamp:2026-03-11 18:49:16.082496846 +0000 UTC m=+1.730193126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.700724 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcc9916d60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.083670368 +0000 UTC m=+1.731366648,LastTimestamp:2026-03-11 18:49:16.083670368 +0000 UTC m=+1.731366648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.707468 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfcc993a1f5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.083814901 +0000 UTC m=+1.731511221,LastTimestamp:2026-03-11 18:49:16.083814901 +0000 UTC m=+1.731511221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.712856 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfcc997b9d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.084083156 +0000 UTC m=+1.731779446,LastTimestamp:2026-03-11 18:49:16.084083156 +0000 UTC m=+1.731779446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.718306 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcc9a4540f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.084909071 +0000 UTC m=+1.732605341,LastTimestamp:2026-03-11 18:49:16.084909071 +0000 UTC m=+1.732605341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.724179 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bddfcc9b88498 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.086232216 +0000 UTC m=+1.733928506,LastTimestamp:2026-03-11 18:49:16.086232216 +0000 UTC m=+1.733928506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.731734 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcdcfd6e5f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.409515615 +0000 UTC m=+2.057211895,LastTimestamp:2026-03-11 18:49:16.409515615 +0000 UTC m=+2.057211895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.736502 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcde09b5ee openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.427097582 +0000 UTC m=+2.074793892,LastTimestamp:2026-03-11 18:49:16.427097582 +0000 UTC 
m=+2.074793892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.740511 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcde278ef9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.429053689 +0000 UTC m=+2.076749999,LastTimestamp:2026-03-11 18:49:16.429053689 +0000 UTC m=+2.076749999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.744810 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcec0d03ff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.662195199 +0000 UTC m=+2.309891509,LastTimestamp:2026-03-11 18:49:16.662195199 +0000 UTC m=+2.309891509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.749556 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcecda3d94 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.67564482 +0000 UTC m=+2.323341110,LastTimestamp:2026-03-11 18:49:16.67564482 +0000 UTC m=+2.323341110,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.754328 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcecf39b55 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.677307221 +0000 UTC m=+2.325003541,LastTimestamp:2026-03-11 18:49:16.677307221 +0000 UTC m=+2.325003541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.758908 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcfa131079 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.897472633 +0000 UTC m=+2.545168913,LastTimestamp:2026-03-11 18:49:16.897472633 +0000 UTC 
m=+2.545168913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.763488 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcfb20a10c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.915138828 +0000 UTC m=+2.562835108,LastTimestamp:2026-03-11 18:49:16.915138828 +0000 UTC m=+2.562835108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.767764 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfcff0efe0c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.981091852 +0000 UTC m=+2.628788132,LastTimestamp:2026-03-11 18:49:16.981091852 +0000 UTC m=+2.628788132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.772142 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bddfcff41d428 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.984423464 +0000 UTC m=+2.632119744,LastTimestamp:2026-03-11 18:49:16.984423464 +0000 UTC m=+2.632119744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.776606 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfcff83f0ad 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.988756141 +0000 UTC m=+2.636452431,LastTimestamp:2026-03-11 18:49:16.988756141 +0000 UTC m=+2.636452431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.780723 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfcffd0a7e9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.993783785 +0000 UTC m=+2.641480085,LastTimestamp:2026-03-11 18:49:16.993783785 +0000 UTC m=+2.641480085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: 
E0311 18:50:09.784680 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bddfd0bedb683 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.197014659 +0000 UTC m=+2.844710929,LastTimestamp:2026-03-11 18:49:17.197014659 +0000 UTC m=+2.844710929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.788720 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd0c11d2a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.199381157 +0000 UTC m=+2.847077437,LastTimestamp:2026-03-11 18:49:17.199381157 +0000 UTC m=+2.847077437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.792765 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd0c13a826 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.19950135 +0000 UTC m=+2.847197630,LastTimestamp:2026-03-11 18:49:17.19950135 +0000 UTC m=+2.847197630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.797490 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd0c2175ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.200405933 +0000 UTC m=+2.848102213,LastTimestamp:2026-03-11 18:49:17.200405933 +0000 UTC 
m=+2.848102213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.801716 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd0cbd7c24 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.210631204 +0000 UTC m=+2.858327484,LastTimestamp:2026-03-11 18:49:17.210631204 +0000 UTC m=+2.858327484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.805327 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd0cda972f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.212538671 +0000 UTC m=+2.860234951,LastTimestamp:2026-03-11 18:49:17.212538671 +0000 UTC m=+2.860234951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.807554 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd0cef3175 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.213888885 +0000 UTC m=+2.861585175,LastTimestamp:2026-03-11 18:49:17.213888885 +0000 UTC m=+2.861585175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.809624 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd0d0c0ee5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.215780581 +0000 UTC m=+2.863476861,LastTimestamp:2026-03-11 18:49:17.215780581 +0000 UTC m=+2.863476861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.811353 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bddfd0d1612d3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.216436947 +0000 UTC m=+2.864133227,LastTimestamp:2026-03-11 18:49:17.216436947 +0000 UTC m=+2.864133227,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.814316 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd18b49b92 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.411376018 +0000 UTC m=+3.059072298,LastTimestamp:2026-03-11 18:49:17.411376018 +0000 UTC m=+3.059072298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.816014 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd18cf34ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.413119231 +0000 UTC m=+3.060815511,LastTimestamp:2026-03-11 18:49:17.413119231 +0000 UTC m=+3.060815511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 
18:50:09.817751 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd19aa6686 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.427484294 +0000 UTC m=+3.075180574,LastTimestamp:2026-03-11 18:49:17.427484294 +0000 UTC m=+3.075180574,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.820636 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd19ac3ce3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.427604707 +0000 UTC m=+3.075301027,LastTimestamp:2026-03-11 18:49:17.427604707 +0000 UTC m=+3.075301027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.824758 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd19bcfe54 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.428702804 +0000 UTC m=+3.076399124,LastTimestamp:2026-03-11 18:49:17.428702804 +0000 UTC m=+3.076399124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.828050 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd19c90e24 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.429493284 +0000 UTC m=+3.077189564,LastTimestamp:2026-03-11 18:49:17.429493284 +0000 UTC m=+3.077189564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.834357 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd2544d7e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.622155238 +0000 UTC m=+3.269851518,LastTimestamp:2026-03-11 18:49:17.622155238 +0000 UTC m=+3.269851518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.838999 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd25db4ca1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.632015521 +0000 UTC m=+3.279711801,LastTimestamp:2026-03-11 18:49:17.632015521 +0000 UTC m=+3.279711801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.842907 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd26081bc5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.634952133 +0000 UTC m=+3.282648413,LastTimestamp:2026-03-11 18:49:17.634952133 +0000 UTC m=+3.282648413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.847944 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd26170db9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.635931577 +0000 UTC m=+3.283627857,LastTimestamp:2026-03-11 18:49:17.635931577 +0000 UTC m=+3.283627857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.852333 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bddfd270a77cc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.65188398 +0000 UTC m=+3.299580260,LastTimestamp:2026-03-11 18:49:17.65188398 +0000 UTC m=+3.299580260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc 
kubenswrapper[4842]: E0311 18:50:09.856459 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd2b2e3564 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.72133514 +0000 UTC m=+3.369031420,LastTimestamp:2026-03-11 18:49:17.72133514 +0000 UTC m=+3.369031420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.860380 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd2ebecd63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.781142883 +0000 UTC m=+3.428839163,LastTimestamp:2026-03-11 18:49:17.781142883 +0000 UTC m=+3.428839163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.864577 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd2f7a3253 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.793423955 +0000 UTC m=+3.441120235,LastTimestamp:2026-03-11 18:49:17.793423955 +0000 UTC m=+3.441120235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.868708 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd2f88af21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.794373409 +0000 UTC m=+3.442069689,LastTimestamp:2026-03-11 18:49:17.794373409 +0000 UTC m=+3.442069689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.872813 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd3a2a3a41 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.972732481 +0000 UTC m=+3.620428771,LastTimestamp:2026-03-11 18:49:17.972732481 +0000 UTC m=+3.620428771,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.876819 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd3aeb71e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.985395173 +0000 UTC m=+3.633091463,LastTimestamp:2026-03-11 18:49:17.985395173 +0000 UTC m=+3.633091463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.882023 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd3bfe6fae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:18.003417006 +0000 UTC m=+3.651113296,LastTimestamp:2026-03-11 18:49:18.003417006 +0000 UTC m=+3.651113296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.886434 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd472d8a72 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:18.191053426 +0000 UTC m=+3.838749726,LastTimestamp:2026-03-11 18:49:18.191053426 +0000 UTC m=+3.838749726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.890048 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd47d0f0cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:18.201761999 +0000 UTC m=+3.849458289,LastTimestamp:2026-03-11 18:49:18.201761999 +0000 UTC m=+3.849458289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.895106 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189bddfd7826666e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.012669038 +0000 UTC m=+4.660365328,LastTimestamp:2026-03-11 18:49:19.012669038 +0000 UTC m=+4.660365328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.898986 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd845e1d11 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.217646865 +0000 UTC m=+4.865343165,LastTimestamp:2026-03-11 18:49:19.217646865 +0000 UTC m=+4.865343165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: I0311 18:50:09.903675 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.903742 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd84f2be21 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.227387425 +0000 UTC m=+4.875083725,LastTimestamp:2026-03-11 18:49:19.227387425 +0000 UTC m=+4.875083725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.907206 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd850d6dd2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.229136338 +0000 UTC m=+4.876832628,LastTimestamp:2026-03-11 18:49:19.229136338 +0000 UTC 
m=+4.876832628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.910016 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd90f4df20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.428853536 +0000 UTC m=+5.076549816,LastTimestamp:2026-03-11 18:49:19.428853536 +0000 UTC m=+5.076549816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.913581 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd919d466c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.439890028 +0000 UTC m=+5.087586308,LastTimestamp:2026-03-11 18:49:19.439890028 +0000 UTC m=+5.087586308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.918111 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd91ac44ea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.440872682 +0000 UTC m=+5.088568972,LastTimestamp:2026-03-11 18:49:19.440872682 +0000 UTC m=+5.088568972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.921615 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfd9fc32b49 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.677254473 +0000 UTC m=+5.324950753,LastTimestamp:2026-03-11 18:49:19.677254473 +0000 UTC 
m=+5.324950753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.925239 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfda0616bba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.687625658 +0000 UTC m=+5.335321968,LastTimestamp:2026-03-11 18:49:19.687625658 +0000 UTC m=+5.335321968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.928869 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfda07fda3b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.689620027 +0000 UTC 
m=+5.337316357,LastTimestamp:2026-03-11 18:49:19.689620027 +0000 UTC m=+5.337316357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.934373 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfdaf8dadc3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.942184387 +0000 UTC m=+5.589880677,LastTimestamp:2026-03-11 18:49:19.942184387 +0000 UTC m=+5.589880677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.939311 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfdb075ab8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.957388171 +0000 UTC m=+5.605084461,LastTimestamp:2026-03-11 18:49:19.957388171 +0000 UTC 
m=+5.605084461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.944657 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfdb089b6af openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:19.958701743 +0000 UTC m=+5.606398033,LastTimestamp:2026-03-11 18:49:19.958701743 +0000 UTC m=+5.606398033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.949372 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfdbfbe169f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:20.213792415 +0000 UTC 
m=+5.861488735,LastTimestamp:2026-03-11 18:49:20.213792415 +0000 UTC m=+5.861488735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.953677 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bddfdc07918b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:20.226048177 +0000 UTC m=+5.873744487,LastTimestamp:2026-03-11 18:49:20.226048177 +0000 UTC m=+5.873744487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.961404 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 18:50:09 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-controller-manager-crc.189bddfedaf55228 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 11 18:50:09 crc kubenswrapper[4842]: body: Mar 11 18:50:09 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:24.965364264 +0000 UTC m=+10.613060544,LastTimestamp:2026-03-11 18:49:24.965364264 +0000 UTC m=+10.613060544,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 18:50:09 crc kubenswrapper[4842]: > Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.965895 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfedaf658e5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:24.965431525 +0000 UTC m=+10.613127805,LastTimestamp:2026-03-11 18:49:24.965431525 +0000 UTC m=+10.613127805,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.970174 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event=< Mar 11 18:50:09 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-apiserver-crc.189bddffe8049390 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 11 18:50:09 crc kubenswrapper[4842]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 18:50:09 crc kubenswrapper[4842]: Mar 11 18:50:09 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:29.479435152 +0000 UTC m=+15.127131432,LastTimestamp:2026-03-11 18:49:29.479435152 +0000 UTC m=+15.127131432,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 18:50:09 crc kubenswrapper[4842]: > Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.974713 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddffe80517b7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:29.479468983 +0000 UTC 
m=+15.127165263,LastTimestamp:2026-03-11 18:49:29.479468983 +0000 UTC m=+15.127165263,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.978912 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 18:50:09 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-apiserver-crc.189bddffe87d09be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 11 18:50:09 crc kubenswrapper[4842]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 11 18:50:09 crc kubenswrapper[4842]: Mar 11 18:50:09 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:29.487329726 +0000 UTC m=+15.135026006,LastTimestamp:2026-03-11 18:49:29.487329726 +0000 UTC m=+15.135026006,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 18:50:09 crc kubenswrapper[4842]: > Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.983080 4842 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189bddffe80517b7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddffe80517b7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:29.479468983 +0000 UTC m=+15.127165263,LastTimestamp:2026-03-11 18:49:29.487358747 +0000 UTC m=+15.135055017,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.986619 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 18:50:09 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-apiserver-crc.189bde0001c3a6c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 11 18:50:09 crc kubenswrapper[4842]: body: Mar 11 18:50:09 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:29.911387844 +0000 UTC 
m=+15.559084154,LastTimestamp:2026-03-11 18:49:29.911387844 +0000 UTC m=+15.559084154,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 18:50:09 crc kubenswrapper[4842]: > Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.990765 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bde0001c49388 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:29.911448456 +0000 UTC m=+15.559144766,LastTimestamp:2026-03-11 18:49:29.911448456 +0000 UTC m=+15.559144766,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:09 crc kubenswrapper[4842]: E0311 18:50:09.995761 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bddfd2f88af21\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bddfd2f88af21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:17.794373409 +0000 UTC m=+3.442069689,LastTimestamp:2026-03-11 18:49:30.054546327 +0000 UTC m=+15.702242607,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.000534 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 18:50:10 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-controller-manager-crc.189bde012f06f580 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 18:50:10 crc kubenswrapper[4842]: body: Mar 11 18:50:10 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:34.965740928 +0000 UTC m=+20.613437248,LastTimestamp:2026-03-11 18:49:34.965740928 +0000 UTC 
m=+20.613437248,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 18:50:10 crc kubenswrapper[4842]: > Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.006658 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bde012f084c84 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:34.96582874 +0000 UTC m=+20.613525060,LastTimestamp:2026-03-11 18:49:34.96582874 +0000 UTC m=+20.613525060,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.012344 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bde012f06f580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 18:50:10 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-controller-manager-crc.189bde012f06f580 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 18:50:10 crc kubenswrapper[4842]: body: Mar 11 18:50:10 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:34.965740928 +0000 UTC m=+20.613437248,LastTimestamp:2026-03-11 18:49:44.965491534 +0000 UTC m=+30.613187844,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 18:50:10 crc kubenswrapper[4842]: > Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.016753 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bde012f084c84\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bde012f084c84 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:34.96582874 +0000 UTC m=+20.613525060,LastTimestamp:2026-03-11 18:49:44.965573697 
+0000 UTC m=+30.613270007,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.021553 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bde0383455494 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:44.969049236 +0000 UTC m=+30.616745546,LastTimestamp:2026-03-11 18:49:44.969049236 +0000 UTC m=+30.616745546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.028797 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bddfcc9a4540f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcc9a4540f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.084909071 +0000 UTC m=+1.732605341,LastTimestamp:2026-03-11 18:49:45.089619855 +0000 UTC m=+30.737316165,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.035898 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bddfcdcfd6e5f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcdcfd6e5f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.409515615 +0000 UTC m=+2.057211895,LastTimestamp:2026-03-11 18:49:45.313795083 +0000 UTC m=+30.961491363,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.043987 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bddfcde09b5ee\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bddfcde09b5ee openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:16.427097582 +0000 UTC m=+2.074793892,LastTimestamp:2026-03-11 18:49:45.322606111 +0000 UTC m=+30.970302391,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.053357 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bde012f06f580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 11 18:50:10 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-controller-manager-crc.189bde012f06f580 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 11 18:50:10 crc kubenswrapper[4842]: body:
Mar 11 18:50:10 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:34.965740928 +0000 UTC m=+20.613437248,LastTimestamp:2026-03-11 18:49:54.965363094 +0000 UTC m=+40.613059444,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 11 18:50:10 crc kubenswrapper[4842]: >
Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.057790 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bde012f084c84\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bde012f084c84 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:34.96582874 +0000 UTC m=+20.613525060,LastTimestamp:2026-03-11 18:49:54.965516898 +0000 UTC m=+40.613213218,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.066481 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bde012f06f580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 11 18:50:10 crc kubenswrapper[4842]: &Event{ObjectMeta:{kube-controller-manager-crc.189bde012f06f580 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 11 18:50:10 crc kubenswrapper[4842]: body:
Mar 11 18:50:10 crc kubenswrapper[4842]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:49:34.965740928 +0000 UTC m=+20.613437248,LastTimestamp:2026-03-11 18:50:04.966129497 +0000 UTC m=+50.613825807,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 11 18:50:10 crc kubenswrapper[4842]: >
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.295548 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.295821 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.297649 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.297975 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.298073 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.908010 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.909501 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.913689 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.915229 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.915319 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.915333 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:10 crc kubenswrapper[4842]: I0311 18:50:10.915369 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:50:10 crc kubenswrapper[4842]: E0311 18:50:10.922033 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 11 18:50:11 crc kubenswrapper[4842]: I0311 18:50:11.907838 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:12 crc kubenswrapper[4842]: I0311 18:50:12.907692 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:13 crc kubenswrapper[4842]: I0311 18:50:13.907521 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:13 crc kubenswrapper[4842]: I0311 18:50:13.962345 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:13 crc kubenswrapper[4842]: I0311 18:50:13.965060 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:13 crc kubenswrapper[4842]: I0311 18:50:13.965160 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:13 crc kubenswrapper[4842]: I0311 18:50:13.965183 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:13 crc kubenswrapper[4842]: I0311 18:50:13.966446 4842 scope.go:117] "RemoveContainer" containerID="0656ddd427d0fd89eca467ecd52f2add48519de7dd5601d0df316b647e653110"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.218755 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.902694 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.965854 4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.965917 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.965972 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.966140 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.967505 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.967555 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.967574 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.968351 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 11 18:50:14 crc kubenswrapper[4842]: I0311 18:50:14.968520 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049" gracePeriod=30
Mar 11 18:50:15 crc kubenswrapper[4842]: E0311 18:50:15.034235 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.228128 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.234796 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.235334 4842 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049" exitCode=255
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.235379 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049"}
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.235437 4842 scope.go:117] "RemoveContainer" containerID="7edae5b9b939e5967612c927811c66aa016d1528211dbec811920a99b1037acb"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.237461 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.238040 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.239928 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15" exitCode=255
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.239965 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15"}
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.240075 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.241065 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.241123 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.241143 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.241977 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15"
Mar 11 18:50:15 crc kubenswrapper[4842]: E0311 18:50:15.242265 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.257035 4842 scope.go:117] "RemoveContainer" containerID="0656ddd427d0fd89eca467ecd52f2add48519de7dd5601d0df316b647e653110"
Mar 11 18:50:15 crc kubenswrapper[4842]: I0311 18:50:15.905520 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.245477 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.248937 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.249010 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a9aa07a66717a26f4250e54999ff868cd607095da1b1cd270e93f8f06abc9a2f"}
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.250723 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.250765 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.250778 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.251096 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 11 18:50:16 crc kubenswrapper[4842]: I0311 18:50:16.904963 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.263031 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.264115 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.264207 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.264287 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.906674 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:17 crc kubenswrapper[4842]: E0311 18:50:17.911811 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.922996 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.924605 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.924684 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.924698 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:17 crc kubenswrapper[4842]: I0311 18:50:17.924731 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:50:17 crc kubenswrapper[4842]: E0311 18:50:17.932790 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 11 18:50:18 crc kubenswrapper[4842]: I0311 18:50:18.904060 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:19 crc kubenswrapper[4842]: I0311 18:50:19.903900 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:19 crc kubenswrapper[4842]: I0311 18:50:19.910451 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:50:19 crc kubenswrapper[4842]: I0311 18:50:19.911094 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:19 crc kubenswrapper[4842]: I0311 18:50:19.912045 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:19 crc kubenswrapper[4842]: I0311 18:50:19.912135 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:19 crc kubenswrapper[4842]: I0311 18:50:19.912197 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:19 crc kubenswrapper[4842]: I0311 18:50:19.912745 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15"
Mar 11 18:50:19 crc kubenswrapper[4842]: E0311 18:50:19.913013 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:50:20 crc kubenswrapper[4842]: I0311 18:50:20.903500 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:21 crc kubenswrapper[4842]: I0311 18:50:21.907592 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:21 crc kubenswrapper[4842]: I0311 18:50:21.964831 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:50:21 crc kubenswrapper[4842]: I0311 18:50:21.965070 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:21 crc kubenswrapper[4842]: I0311 18:50:21.966210 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:21 crc kubenswrapper[4842]: I0311 18:50:21.966249 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:21 crc kubenswrapper[4842]: I0311 18:50:21.966258 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:21 crc kubenswrapper[4842]: I0311 18:50:21.970752 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:50:22 crc kubenswrapper[4842]: I0311 18:50:22.274848 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:22 crc kubenswrapper[4842]: I0311 18:50:22.274930 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:50:22 crc kubenswrapper[4842]: I0311 18:50:22.275938 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:22 crc kubenswrapper[4842]: I0311 18:50:22.276005 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:22 crc kubenswrapper[4842]: I0311 18:50:22.276018 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:22 crc kubenswrapper[4842]: I0311 18:50:22.903998 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:23 crc kubenswrapper[4842]: I0311 18:50:23.276806 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:23 crc kubenswrapper[4842]: I0311 18:50:23.278043 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:23 crc kubenswrapper[4842]: I0311 18:50:23.278092 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:23 crc kubenswrapper[4842]: I0311 18:50:23.278108 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:23 crc kubenswrapper[4842]: I0311 18:50:23.903318 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:24 crc kubenswrapper[4842]: I0311 18:50:24.907923 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:24 crc kubenswrapper[4842]: E0311 18:50:24.918373 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 11 18:50:24 crc kubenswrapper[4842]: I0311 18:50:24.933691 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:24 crc kubenswrapper[4842]: I0311 18:50:24.934947 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:24 crc kubenswrapper[4842]: I0311 18:50:24.935025 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:24 crc kubenswrapper[4842]: I0311 18:50:24.935040 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:24 crc kubenswrapper[4842]: I0311 18:50:24.935080 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:50:24 crc kubenswrapper[4842]: E0311 18:50:24.940217 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 11 18:50:25 crc kubenswrapper[4842]: E0311 18:50:25.034681 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 11 18:50:25 crc kubenswrapper[4842]: I0311 18:50:25.235818 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:50:25 crc kubenswrapper[4842]: I0311 18:50:25.236060 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:25 crc kubenswrapper[4842]: I0311 18:50:25.237333 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:25 crc kubenswrapper[4842]: I0311 18:50:25.237383 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:25 crc kubenswrapper[4842]: I0311 18:50:25.237403 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:25 crc kubenswrapper[4842]: I0311 18:50:25.238115 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15"
Mar 11 18:50:25 crc kubenswrapper[4842]: E0311 18:50:25.238483 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:50:25 crc kubenswrapper[4842]: I0311 18:50:25.907598 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:26 crc kubenswrapper[4842]: I0311 18:50:26.718966 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 11 18:50:26 crc kubenswrapper[4842]: I0311 18:50:26.733338 4842 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 11 18:50:26 crc kubenswrapper[4842]: I0311 18:50:26.903912 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:27 crc kubenswrapper[4842]: I0311 18:50:27.903413 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:28 crc kubenswrapper[4842]: I0311 18:50:28.906196 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 11 18:50:29 crc kubenswrapper[4842]: I0311 18:50:29.869248 4842 csr.go:261] certificate signing request csr-f8clm is approved, waiting to be issued
Mar 11 18:50:29 crc kubenswrapper[4842]: I0311 18:50:29.877858 4842 csr.go:257] certificate signing request csr-f8clm is issued
Mar 11 18:50:29 crc kubenswrapper[4842]: I0311 18:50:29.894439 4842 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 11 18:50:30 crc kubenswrapper[4842]: I0311 18:50:30.778730 4842 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 11 18:50:30 crc kubenswrapper[4842]: I0311 18:50:30.879723 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-02 00:51:20.853746591 +0000 UTC
Mar 11 18:50:30 crc kubenswrapper[4842]: I0311 18:50:30.879828 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7110h0m49.973926859s for next certificate rotation
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.940502 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.942214 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.942284 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.942298 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.942430 4842 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.952044 4842 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.952396 4842 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 11 18:50:31 crc kubenswrapper[4842]: E0311 18:50:31.952452 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.956241 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.956352 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.956386 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.956423 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.956446 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:31Z","lastTransitionTime":"2026-03-11T18:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:31 crc kubenswrapper[4842]: E0311 18:50:31.975756 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.984832 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.984890 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.984918 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.984949 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:31 crc kubenswrapper[4842]: I0311 18:50:31.984973 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:31Z","lastTransitionTime":"2026-03-11T18:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.000766 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.009056 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.009125 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.009149 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.009183 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.009208 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:32Z","lastTransitionTime":"2026-03-11T18:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.023661 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.031795 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.031862 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.031886 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.031913 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.031936 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:32Z","lastTransitionTime":"2026-03-11T18:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.044026 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.044172 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.044304 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.144843 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.245990 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.346633 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.447508 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.547697 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.647969 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.748148 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.838259 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:50:32 crc 
kubenswrapper[4842]: I0311 18:50:32.838771 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.841118 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.841176 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:32 crc kubenswrapper[4842]: I0311 18:50:32.841198 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.848999 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:32 crc kubenswrapper[4842]: E0311 18:50:32.949356 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.050491 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.151102 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.251305 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.352249 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.453189 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.554571 4842 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.655049 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.756108 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.856981 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:33 crc kubenswrapper[4842]: E0311 18:50:33.957441 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.058345 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.159404 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.259890 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.361066 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.462236 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.562917 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.664454 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.765748 4842 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.866257 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:34 crc kubenswrapper[4842]: E0311 18:50:34.967335 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.034828 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.067536 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.168012 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.269522 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.371074 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.471568 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.572360 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.673360 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.774404 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc 
kubenswrapper[4842]: E0311 18:50:35.874646 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:35 crc kubenswrapper[4842]: E0311 18:50:35.975447 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.076040 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.176999 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.277202 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.378358 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.478720 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.579912 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.680567 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.780941 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.881373 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:36 crc kubenswrapper[4842]: E0311 18:50:36.981517 4842 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.081952 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.182419 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.282787 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.383502 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.484148 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.584379 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.685877 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.786743 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.887837 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:37 crc kubenswrapper[4842]: E0311 18:50:37.988336 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.088818 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.189851 4842 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.290986 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.391372 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.492210 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.593210 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.693685 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.794620 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.895673 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:38 crc kubenswrapper[4842]: I0311 18:50:38.961924 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 18:50:38 crc kubenswrapper[4842]: I0311 18:50:38.963028 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:38 crc kubenswrapper[4842]: I0311 18:50:38.963074 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:38 crc kubenswrapper[4842]: I0311 18:50:38.963088 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:38 crc kubenswrapper[4842]: I0311 
18:50:38.963704 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.963898 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 18:50:38 crc kubenswrapper[4842]: E0311 18:50:38.996179 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.096435 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.197597 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.298031 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.399305 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.500369 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.601299 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.702070 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: 
E0311 18:50:39.802691 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:39 crc kubenswrapper[4842]: E0311 18:50:39.903695 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.004429 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.105363 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.206361 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.307352 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.407501 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.508906 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.610049 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.711427 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:40 crc kubenswrapper[4842]: I0311 18:50:40.767897 4842 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.812246 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
11 18:50:40 crc kubenswrapper[4842]: E0311 18:50:40.913484 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.014086 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.115240 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.216304 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.317389 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.418193 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.518410 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.619623 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.720731 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.821896 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:41 crc kubenswrapper[4842]: E0311 18:50:41.922418 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.023561 4842 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.124225 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.224932 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.313602 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.319579 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.319630 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.319650 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.319676 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.319694 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:42Z","lastTransitionTime":"2026-03-11T18:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.337116 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.342602 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.342674 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.342719 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.342743 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.342759 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:42Z","lastTransitionTime":"2026-03-11T18:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.354577 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.360621 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.360705 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.360721 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.360761 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.360774 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:42Z","lastTransitionTime":"2026-03-11T18:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.372816 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.376154 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.376174 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.376182 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.376209 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.376219 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:42Z","lastTransitionTime":"2026-03-11T18:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.390980 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.391101 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.391153 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.492153 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.592803 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.693830 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.794308 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.895255 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.899375 4842 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.933372 4842 apiserver.go:52] "Watching apiserver" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.937931 4842 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.938189 4842 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.938597 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.938676 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.939695 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.939773 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.940016 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.939839 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.939847 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:42 crc kubenswrapper[4842]: E0311 18:50:42.940096 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.939828 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.942259 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.942306 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.942988 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.947552 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.948434 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.948734 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.949606 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.949783 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.950016 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.971071 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.982501 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.994087 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.997072 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.997104 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.997116 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.997132 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:42 crc kubenswrapper[4842]: I0311 18:50:42.997142 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:42Z","lastTransitionTime":"2026-03-11T18:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.001770 4842 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.002986 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.011848 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.018695 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.026549 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074830 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074865 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074886 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074907 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074922 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074936 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074952 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.074968 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 
18:50:43.074984 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075000 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075014 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075032 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075045 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075060 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075077 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075091 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075105 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075116 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075129 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075149 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075173 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075187 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075203 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075219 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075234 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075248 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075264 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075310 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075337 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 18:50:43 crc 
kubenswrapper[4842]: I0311 18:50:43.075352 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075371 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075387 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075402 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075419 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075435 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075450 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075471 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075489 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075504 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075506 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075520 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075535 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075550 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075552 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075568 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075583 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075599 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075614 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075630 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075644 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075659 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075675 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075690 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075705 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075721 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 18:50:43 crc 
kubenswrapper[4842]: I0311 18:50:43.075735 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075736 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075749 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075763 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075778 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075795 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075812 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075827 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075844 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075858 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075874 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 
11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075888 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075902 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075919 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075935 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075950 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075966 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076055 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076072 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076088 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076103 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076122 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076136 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076150 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076167 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076183 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076200 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076215 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076230 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076245 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076261 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076295 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076310 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076325 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076340 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076354 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076368 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076384 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076402 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076417 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076434 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076450 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076465 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076480 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076496 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076511 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076526 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076542 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076558 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076573 4842 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076588 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076603 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076618 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076633 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076648 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076662 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076678 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076694 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076710 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076726 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076742 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076757 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076774 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076789 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076803 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076820 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076836 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076852 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076868 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076918 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076935 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076951 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076968 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076985 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077001 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077017 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077033 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: 
I0311 18:50:43.077048 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077065 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077080 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077096 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077113 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077130 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077144 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077160 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077175 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077190 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077206 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077225 
4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077253 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077286 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077315 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077344 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077363 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077378 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077395 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077412 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077428 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077444 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077460 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077476 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077492 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077507 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077523 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077541 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 
18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077556 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077573 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077590 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077605 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077621 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077645 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077666 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077691 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077707 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077724 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077742 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 18:50:43 crc 
kubenswrapper[4842]: I0311 18:50:43.077758 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077774 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077792 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077808 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077827 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077843 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077861 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077878 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077894 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077912 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.075915 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: 
"7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076070 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076182 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076218 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076327 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076392 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076456 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076570 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076680 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076815 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076991 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.076999 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077124 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077152 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077239 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077328 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077389 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077744 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077927 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.078164 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.078234 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.078406 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.078603 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.078993 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.079358 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.079442 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.079588 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.079791 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.080102 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.080179 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.080490 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.080730 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.080945 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.080864 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081250 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081325 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081582 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081643 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.081677 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:50:43.581651885 +0000 UTC m=+89.229348375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081755 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081859 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.077934 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081908 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081938 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081963 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.081986 4842 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082010 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082034 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082059 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082074 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082084 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082108 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082133 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082156 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082182 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082205 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082231 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082257 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082298 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082324 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082369 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082392 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082400 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082402 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082438 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082464 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082488 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082536 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082558 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082577 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082594 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082613 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082632 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082650 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082696 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082703 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082728 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082791 4842 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082810 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" 
(UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082824 4842 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082838 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082850 4842 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082861 4842 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082872 4842 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082883 4842 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082892 4842 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" 
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082902 4842 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082911 4842 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082920 4842 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082929 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082938 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082948 4842 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082959 4842 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082969 4842 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082978 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082989 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082998 4842 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083008 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083018 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083027 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083036 4842 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083046 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083056 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083065 4842 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083074 4842 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083082 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083091 4842 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083101 4842 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083110 4842 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083119 4842 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083128 4842 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083136 4842 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083145 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083154 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083163 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083174 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083183 4842 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083193 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083201 4842 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083211 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083221 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083229 4842 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 
11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083238 4842 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083247 4842 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.079030 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082931 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.082921 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083327 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083371 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083540 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083860 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083841 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.083227 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.084003 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.084313 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.084416 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.084636 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.084818 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.084908 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.084962 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085102 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085130 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085510 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085477 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085596 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085750 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085764 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085807 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085796 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085898 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.085942 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.086328 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.086339 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.086361 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.086380 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.086621 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.086649 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.086642 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.087428 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.087445 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.087436 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.087662 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.087695 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.087796 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.087982 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088007 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088030 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088009 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088233 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088247 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088450 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088465 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.088873 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.089258 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.089247 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.089529 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.089549 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.089535 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.089676 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.089815 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.090678 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.090726 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.090737 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.090689 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.090931 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.090982 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.091327 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.091382 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.091415 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.091573 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.091810 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092123 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092155 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092194 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092628 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092754 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092773 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092815 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.092912 4842 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093002 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093129 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093178 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093462 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093643 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093689 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093878 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093916 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.093965 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.094076 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.094074 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.094230 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:43.59420645 +0000 UTC m=+89.241902730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.094866 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.095018 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:43.594993081 +0000 UTC m=+89.242689561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.096044 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.096263 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.096302 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.096400 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.096538 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.096564 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.096555 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.099802 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.100255 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.100286 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.100496 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.100598 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.100702 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.100805 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.101190 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.101422 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.104155 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.104048 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.104384 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.105807 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.105831 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.105848 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.105931 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:43.605904743 +0000 UTC m=+89.253601043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.107833 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.109939 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.111604 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.118532 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.118601 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.118636 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.118778 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.118812 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.118832 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.118904 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:43.618878179 +0000 UTC m=+89.266574679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.119212 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.119720 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.120421 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.120631 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.120762 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.121024 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.121359 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.121739 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.121396 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.121847 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.121937 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.121739 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.122521 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.122582 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.123783 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.124163 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.124175 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.124441 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.124520 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.124573 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.124759 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125427 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125503 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125579 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125675 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125745 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125836 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125928 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.125961 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.126940 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.127105 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.127305 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.127381 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.127876 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128105 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128406 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128469 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128593 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128624 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128717 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128762 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.128807 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.129440 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.129645 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.129671 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.131489 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.137934 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.138118 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.148931 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.153070 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184025 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184068 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184140 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184169 4842 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184186 4842 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184200 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184202 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184213 4842 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184226 4842 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184237 4842 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184249 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184260 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184288 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184301 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184313 4842 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184328 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184342 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184354 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184367 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184378 4842 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184389 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184400 4842 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184411 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184421 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184434 4842 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184446 4842 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184457 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 
18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184468 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184479 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184490 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184500 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184512 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184522 4842 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184533 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc 
kubenswrapper[4842]: I0311 18:50:43.184545 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184557 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184568 4842 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184579 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184592 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184605 4842 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184615 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184626 4842 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184637 4842 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184647 4842 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184658 4842 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184669 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184680 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184692 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184703 4842 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184715 4842 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184726 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184737 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184748 4842 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184759 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184771 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184782 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:50:43 crc 
kubenswrapper[4842]: I0311 18:50:43.184795 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184808 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184820 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184832 4842 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184842 4842 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184864 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184876 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184888 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184899 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184909 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184920 4842 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184933 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184944 4842 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184956 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184968 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184979 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.184991 4842 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185002 4842 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185013 4842 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185026 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185037 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185049 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185060 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185072 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185111 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185122 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185135 4842 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185146 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185156 4842 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185167 4842 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185177 4842 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185188 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185198 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185211 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185221 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185231 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185242 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185253 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185285 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185298 4842 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185308 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185322 4842 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185334 4842 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185345 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185357 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185369 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185380 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185391 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185403 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185414 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185425 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185436 4842 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185448 4842 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185459 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185470 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185483 4842 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185493 4842 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185505 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185516 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185526 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185537 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185548 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185560 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185572 4842 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185584 4842 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185594 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185605 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185617 4842 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185628 4842 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185639 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185651 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185662 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185672 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185684 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185695 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185706 4842 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185717 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185728 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185739 4842 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185750 4842 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185761 4842 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185772 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185783 4842 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185796 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185810 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185823 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185835 4842 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185845 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185857 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185868 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185879 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185890 4842 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185902 4842 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185913 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.185924 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.202675 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.202707 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.202718 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.202733 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.202743 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.264190 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.271549 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.280175 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.307794 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.307850 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.307869 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.307897 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.307915 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.341226 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4022513854f6ea2c58bea6297024467da88fcc08046464fcb92c810ff2bfc8f6"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.342176 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3beccdb8296531808521bf3061fa15aa464d633e27e8a16d1047f0c48b554d1c"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.343527 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5ef61ff105309d1cf2e2e8e5bd11206b4f621534d769926b45fae87e6cd799be"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.411542 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.411594 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.411613 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.411641 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.411659 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.513713 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.513768 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.513780 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.513800 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.513811 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.589740 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.590040 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:50:44.589986432 +0000 UTC m=+90.237682742 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.617013 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.617067 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.617078 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.617098 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.617111 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.690815 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.690913 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.690967 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.691037 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691038 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691159 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:44.691133411 +0000 UTC m=+90.338829691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691266 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691339 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691370 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691432 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691537 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:44.691508091 +0000 UTC m=+90.339204411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691540 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691582 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691584 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object
"openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691734 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:44.691685956 +0000 UTC m=+90.339382396 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:43 crc kubenswrapper[4842]: E0311 18:50:43.691786 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:44.691764858 +0000 UTC m=+90.339461378 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.719848 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.719922 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.719939 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.719966 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.719985 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.823426 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.823490 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.823512 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.823539 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.823557 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.926745 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.926775 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.926784 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.926798 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:43 crc kubenswrapper[4842]: I0311 18:50:43.926808 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:43Z","lastTransitionTime":"2026-03-11T18:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.030103 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.032046 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.032069 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.032090 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.032104 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.134931 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.134974 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.134986 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.135004 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.135020 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.237675 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.237727 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.237736 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.237752 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.237763 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.339904 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.339965 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.339977 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.339997 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.340011 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.346877 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.348844 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.348896 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.360715 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.372788 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.385779 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.395792 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.405926 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.415486 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.425882 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.439862 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.442616 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.442676 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.442693 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.442711 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.442721 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.453195 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.467582 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.480207 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.494004 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.544663 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.544728 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.544738 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.544756 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.544768 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.600552 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.600700 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:50:46.600667214 +0000 UTC m=+92.248363494 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.646679 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.646731 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.646743 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.646766 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.646778 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.701534 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.701672 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.701702 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701566 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.701753 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701766 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701781 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701808 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701824 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:46.701809434 +0000 UTC m=+92.349505714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701910 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701928 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701926 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701935 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:46.701911467 +0000 UTC m=+92.349607747 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.702063 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:46.70204015 +0000 UTC m=+92.349736430 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.701941 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.702196 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:46.702166153 +0000 UTC m=+92.349862453 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.749338 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.749375 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.749392 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.749413 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.749425 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.851832 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.851886 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.851899 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.851920 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.851934 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.954767 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.954827 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.954844 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.954873 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.954891 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:44Z","lastTransitionTime":"2026-03-11T18:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.961351 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.961490 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.961373 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.961351 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.961560 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:44 crc kubenswrapper[4842]: E0311 18:50:44.961733 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.967111 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.967775 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.968457 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.969014 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 11 18:50:44 crc kubenswrapper[4842]: I0311 18:50:44.985405 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.007195 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.014834 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.015571 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.016150 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.016700 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 
18:50:45.017319 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.017794 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.018255 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.018931 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.019466 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.019952 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.020468 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.020951 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 
18:50:45.021492 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.021976 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.022235 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.024827 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.025368 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.025804 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.026710 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.027122 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.028112 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.028620 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.029594 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.030179 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.031045 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.031656 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.032567 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.033187 4842 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.033309 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.034834 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.035707 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.036398 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.038313 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.039553 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.039885 4842 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.040207 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.041931 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.042972 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.043982 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.044662 
4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.046358 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.047417 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.048071 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.048946 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.053473 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.057004 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.057062 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.057074 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: 
I0311 18:50:45.057092 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.057121 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.057883 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.058950 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.062770 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.064482 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.067151 4842 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.069060 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.072360 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.072743 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:45 crc 
kubenswrapper[4842]: I0311 18:50:45.073024 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.160341 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.160699 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.160709 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.160729 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.160739 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.198122 4842 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.263414 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.263477 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.263489 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.263509 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.263525 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.365673 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.365715 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.365726 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.365748 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.365762 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.468341 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.468386 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.468401 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.468421 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.468435 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.578727 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.578781 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.578794 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.578819 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.578833 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.681365 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.681400 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.681408 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.681421 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.681430 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.783848 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.783887 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.783895 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.783911 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.783921 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.885985 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.886029 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.886038 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.886061 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.886071 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.988840 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.988890 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.988903 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.989313 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:45 crc kubenswrapper[4842]: I0311 18:50:45.989350 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:45Z","lastTransitionTime":"2026-03-11T18:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.092382 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.092454 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.092469 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.092490 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.092507 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.195682 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.195759 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.195771 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.195791 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.195804 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.297667 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.297701 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.297712 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.297727 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.297737 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.355162 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.368865 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:46Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.383217 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:46Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.395423 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:46Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.400664 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.400796 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.400816 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.400842 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.400893 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.407593 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:46Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.422063 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:50:46Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.435804 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f
45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:46Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.503227 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.503317 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.503337 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.503361 4842 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.503379 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.606034 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.606090 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.606101 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.606123 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.606135 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.621560 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.621871 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:50:50.621831886 +0000 UTC m=+96.269528176 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.709757 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.709796 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.709813 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.709830 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: 
I0311 18:50:46.709841 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.722157 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.722208 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.722225 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.722298 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722364 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722381 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722393 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722402 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722415 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722445 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:50.722422041 +0000 UTC m=+96.370118321 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722442 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722484 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722504 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722464 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:50.722455382 +0000 UTC m=+96.370151662 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722600 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:50.722573155 +0000 UTC m=+96.370269635 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.722626 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:50.722615556 +0000 UTC m=+96.370312056 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.811939 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.811970 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.811978 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.811992 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.812000 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.914477 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.914515 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.914525 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.914539 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.914548 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:46Z","lastTransitionTime":"2026-03-11T18:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.962030 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.962251 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.962585 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.962749 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:46 crc kubenswrapper[4842]: I0311 18:50:46.962921 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:46 crc kubenswrapper[4842]: E0311 18:50:46.963014 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.017143 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.017208 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.017225 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.017259 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.017321 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.119984 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.120048 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.120057 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.120072 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.120083 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.223302 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.223373 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.223386 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.223406 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.223419 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.325888 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.325945 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.325967 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.325988 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.326001 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.428461 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.428539 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.428562 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.428590 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.428611 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.530864 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.530914 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.530926 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.530946 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.530959 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.633894 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.633959 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.633978 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.634000 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.634011 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.736562 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.736603 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.736615 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.736635 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.736645 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.824383 4842 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.839564 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.839608 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.839623 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.839640 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.839652 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.941891 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.941976 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.941994 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.942047 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:47 crc kubenswrapper[4842]: I0311 18:50:47.942065 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:47Z","lastTransitionTime":"2026-03-11T18:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.045202 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.045241 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.045255 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.045316 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.045331 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.148152 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.148222 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.148234 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.148252 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.148281 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.251563 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.251601 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.251628 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.251645 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.251657 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.355831 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.355912 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.355920 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.355957 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.355967 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.458115 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.458164 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.458174 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.458192 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.458218 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.561342 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.561399 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.561415 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.561436 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.561448 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.663865 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.663922 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.663932 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.663984 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.663996 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.766667 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.766727 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.766738 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.766755 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.766767 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.870068 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.870103 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.870114 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.870132 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.870142 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.962158 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.962229 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.962349 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:48 crc kubenswrapper[4842]: E0311 18:50:48.962386 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:48 crc kubenswrapper[4842]: E0311 18:50:48.963017 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:48 crc kubenswrapper[4842]: E0311 18:50:48.963097 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.975347 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.975387 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.975399 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.975417 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.975432 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:48Z","lastTransitionTime":"2026-03-11T18:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.977619 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 11 18:50:48 crc kubenswrapper[4842]: I0311 18:50:48.979365 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.078419 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.078477 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.078491 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.078510 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.078522 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.181125 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.181173 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.181182 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.181198 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.181207 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.283509 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.283560 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.283580 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.283602 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.283613 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.386394 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.386455 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.386466 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.386486 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.386498 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.489254 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.489320 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.489332 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.489353 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.489367 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.591663 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.591700 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.591708 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.591724 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.591734 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.693788 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.693844 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.693856 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.693873 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.693885 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.796346 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.796394 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.796405 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.796425 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.796437 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.898489 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.898531 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.898543 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.898559 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:49 crc kubenswrapper[4842]: I0311 18:50:49.898570 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:49Z","lastTransitionTime":"2026-03-11T18:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.001088 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.001120 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.001129 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.001145 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.001153 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.103001 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.103064 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.103077 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.103112 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.103126 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.207482 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.207571 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.207592 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.207621 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.207640 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.310586 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.310635 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.310647 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.310666 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.310677 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.414411 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.414512 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.414531 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.414560 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.414584 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.517709 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.517789 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.517817 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.517850 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.517872 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.620016 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.620081 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.620099 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.620124 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.620136 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.656855 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.657123 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 18:50:58.65708006 +0000 UTC m=+104.304776380 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.722943 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.723026 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.723049 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.723081 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.723103 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.758454 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.758546 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.758596 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.758668 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758677 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758718 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758739 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758747 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758757 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758847 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758825 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:58.758797864 +0000 UTC m=+104.406494184 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758877 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758900 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758916 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:58.758882877 +0000 UTC m=+104.406579187 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758956 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:58.758941078 +0000 UTC m=+104.406637398 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.758983 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:50:58.758969979 +0000 UTC m=+104.406666299 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.827110 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.827182 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.827202 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.827230 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.827250 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.930405 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.930482 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.930506 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.930536 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.930560 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:50Z","lastTransitionTime":"2026-03-11T18:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.961615 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.961617 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:50 crc kubenswrapper[4842]: I0311 18:50:50.961728 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.961912 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.962040 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:50 crc kubenswrapper[4842]: E0311 18:50:50.962232 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.033227 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.033301 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.033318 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.033337 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.033351 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.135209 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.135262 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.135311 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.135336 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.135352 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.238107 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.238166 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.238183 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.238213 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.238231 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.341285 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.341334 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.341345 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.341362 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.341376 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.443582 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.443620 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.443629 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.443646 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.443661 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.545946 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.545997 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.546010 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.546035 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.546046 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.648520 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.648567 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.648578 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.648597 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.648609 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.751031 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.751074 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.751084 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.751104 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.751117 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.853223 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.853333 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.853360 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.853389 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.853411 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.957091 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.957221 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.957248 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.957309 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:51 crc kubenswrapper[4842]: I0311 18:50:51.957336 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:51Z","lastTransitionTime":"2026-03-11T18:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.059937 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.060015 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.060039 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.060069 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.060093 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.163407 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.163472 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.163486 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.163506 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.163540 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.266593 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.266680 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.266698 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.266724 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.266742 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.369832 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.369881 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.369892 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.369910 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.369923 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.473008 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.473055 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.473064 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.473081 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.473090 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.516307 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.516391 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.516416 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.516447 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.516470 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.536393 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:52Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.541192 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.541339 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.541378 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.541411 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.541434 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.561030 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:52Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.565766 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.565807 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.565816 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.565836 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.565850 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.583176 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:52Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.587020 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.587084 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.587102 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.587128 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.587145 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.622057 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:52Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.627371 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.627432 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.627447 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.627469 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.627482 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.643412 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:52Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.643623 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.645447 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.645490 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.645507 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.645533 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.645550 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.749014 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.749143 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.749170 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.749195 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.749241 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.851975 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.852020 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.852030 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.852049 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.852076 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.955161 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.955240 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.955261 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.955314 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.955334 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:52Z","lastTransitionTime":"2026-03-11T18:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.961889 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.961892 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.962824 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.962877 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.963174 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.963447 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.979940 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15" Mar 11 18:50:52 crc kubenswrapper[4842]: I0311 18:50:52.980055 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 18:50:52 crc kubenswrapper[4842]: E0311 18:50:52.980191 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.058791 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.058895 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.058919 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.058955 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.058981 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.161877 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.161943 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.161962 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.161986 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.162004 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.264356 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.264400 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.264411 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.264429 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.264442 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.367075 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.367134 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.367151 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.367180 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.367199 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.372528 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15"
Mar 11 18:50:53 crc kubenswrapper[4842]: E0311 18:50:53.372706 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.470891 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.470935 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.470944 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.470960 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.470970 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.573970 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.574023 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.574035 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.574057 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.574071 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.676812 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.676851 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.676862 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.676880 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.676891 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.779500 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.779536 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.779544 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.779560 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.779570 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.882492 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.882536 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.882549 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.882569 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.882581 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.985432 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.985496 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.985514 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.985537 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:53 crc kubenswrapper[4842]: I0311 18:50:53.985554 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:53Z","lastTransitionTime":"2026-03-11T18:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.088095 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.088141 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.088153 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.088172 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.088186 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.191251 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.191303 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.191315 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.191331 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.191342 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.293632 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.293699 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.293726 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.293757 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.293782 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.396499 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.396562 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.396582 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.396606 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.396623 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.499692 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.500016 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.500125 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.500221 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.500337 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.604337 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.604789 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.605018 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.605231 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.605609 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.708430 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.708490 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.708508 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.708537 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.708556 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.811759 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.811832 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.811844 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.811883 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.811894 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.913982 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.914032 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.914045 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.914069 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.914081 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:54Z","lastTransitionTime":"2026-03-11T18:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.961575 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.961620 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.961710 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 18:50:54 crc kubenswrapper[4842]: E0311 18:50:54.961838 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 18:50:54 crc kubenswrapper[4842]: E0311 18:50:54.962129 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 18:50:54 crc kubenswrapper[4842]: E0311 18:50:54.962192 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.977994 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:54Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:54 crc kubenswrapper[4842]: I0311 18:50:54.990657 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:54Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.008624 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.016729 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.016786 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.016804 4842 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.016829 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.016846 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.035827 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd
526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.060024 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.076484 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.086721 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:50:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.103816 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f
45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.114313 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:50:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.118531 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.118704 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.118792 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.118879 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.118981 4842 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.221250 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.221355 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.221377 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.221410 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.221432 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.323665 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.323739 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.323763 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.323795 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.323817 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.427631 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.427692 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.427706 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.427728 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.427746 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.530899 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.530943 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.530954 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.530969 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.530981 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.633151 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.633181 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.633192 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.633206 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.633216 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.736368 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.736447 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.736474 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.736505 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.736531 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.839846 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.839899 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.839916 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.839940 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.839967 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.941866 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.941907 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.941917 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.941938 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:55 crc kubenswrapper[4842]: I0311 18:50:55.941951 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:55Z","lastTransitionTime":"2026-03-11T18:50:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.044470 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.044529 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.044539 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.044555 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.044566 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.147595 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.147643 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.147652 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.147668 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.147678 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.250406 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.250470 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.250490 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.250516 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.250532 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.352966 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.353038 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.353056 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.353083 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.353101 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.456352 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.456403 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.456574 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.456619 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.456638 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.559599 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.559654 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.559670 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.559696 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.559714 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.666073 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.666484 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.666675 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.666829 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.666970 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.769690 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.769750 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.769770 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.769794 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.769812 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.873211 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.873357 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.873380 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.873452 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.873476 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.961195 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.961310 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.961194 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:56 crc kubenswrapper[4842]: E0311 18:50:56.961507 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:56 crc kubenswrapper[4842]: E0311 18:50:56.961631 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:56 crc kubenswrapper[4842]: E0311 18:50:56.961731 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.975017 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.975051 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.975077 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.975094 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:56 crc kubenswrapper[4842]: I0311 18:50:56.975103 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:56Z","lastTransitionTime":"2026-03-11T18:50:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.078193 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.078240 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.078249 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.078264 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.078296 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.183031 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.183305 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.183335 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.183367 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.183389 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.286291 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.286647 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.286811 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.286975 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.287126 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.389111 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.389366 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.389455 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.389538 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.389620 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.491996 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.492085 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.492105 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.492138 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.492158 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.594189 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.594227 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.594238 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.594256 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.594266 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.697513 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.697581 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.697594 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.697616 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.697626 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.800392 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.800456 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.800475 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.800504 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.800522 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.903051 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.903422 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.903588 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.903740 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:57 crc kubenswrapper[4842]: I0311 18:50:57.903859 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:57Z","lastTransitionTime":"2026-03-11T18:50:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.006449 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.006892 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.007027 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.007191 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.007385 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.114752 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.114809 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.114831 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.114863 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.114883 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.217457 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.217518 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.217535 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.217564 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.217581 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.320657 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.320739 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.320758 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.320788 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.320806 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.423100 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.423175 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.423193 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.423221 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.423239 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.525943 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.526005 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.526022 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.526048 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.526065 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.628939 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.629065 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.629093 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.629129 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.629152 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.732301 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.732363 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.732381 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.732411 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.732429 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.732769 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.733007 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 18:51:14.732970667 +0000 UTC m=+120.380666987 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.833372 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.833436 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.833473 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.833509 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833577 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833618 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833684 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:51:14.833659305 +0000 UTC m=+120.481355615 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833708 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:51:14.833697246 +0000 UTC m=+120.481393566 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833755 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833790 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833816 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833857 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833922 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833945 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.833993 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:51:14.833903821 +0000 UTC m=+120.481600131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.834072 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:51:14.834050265 +0000 UTC m=+120.481746585 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.835091 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.835142 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.835158 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.835183 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.835201 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.938258 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.938344 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.938365 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.938393 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.938413 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:58Z","lastTransitionTime":"2026-03-11T18:50:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.961965 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.962014 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:50:58 crc kubenswrapper[4842]: I0311 18:50:58.962014 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.962320 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.962723 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:50:58 crc kubenswrapper[4842]: E0311 18:50:58.963140 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.041017 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.041541 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.041721 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.041931 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.042121 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.144623 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.144689 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.144706 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.144813 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.144834 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.247576 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.247626 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.247640 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.247661 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.247674 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.351135 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.351225 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.351255 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.351333 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.351359 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.454119 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.454171 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.454183 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.454203 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.454215 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.556786 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.556843 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.556861 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.556888 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.556905 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.661817 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.661911 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.661934 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.661963 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.661983 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.765871 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.765938 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.765955 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.765982 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.765999 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.873049 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.873204 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.873234 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.873315 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.873343 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.976781 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.976853 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.976879 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.976913 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:50:59 crc kubenswrapper[4842]: I0311 18:50:59.976935 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:50:59Z","lastTransitionTime":"2026-03-11T18:50:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.080297 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.080367 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.080376 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.080391 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.080400 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.183076 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.183113 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.183121 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.183136 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.183146 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.285688 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.285759 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.285777 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.285806 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.285824 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.388601 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.388678 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.388705 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.388736 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.388756 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.491833 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.491896 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.491914 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.491940 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.491956 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.594359 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.594402 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.594411 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.594427 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.594438 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.697937 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.697985 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.698001 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.698025 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.698042 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.736255 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8lrw5"] Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.736721 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.739501 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.740700 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.741078 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.753749 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z 
is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.792307 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03
-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.800005 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.800038 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.800047 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.800061 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.800072 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.807190 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.818339 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.830604 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.847618 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f
45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.855460 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51f8b6f6-1b94-408b-ad7f-989d62fa1ba5-hosts-file\") pod \"node-resolver-8lrw5\" (UID: \"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\") " pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.855691 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwcxh\" (UniqueName: 
\"kubernetes.io/projected/51f8b6f6-1b94-408b-ad7f-989d62fa1ba5-kube-api-access-bwcxh\") pod \"node-resolver-8lrw5\" (UID: \"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\") " pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.859797 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.876587 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.891188 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.902487 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.902526 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.902535 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.902551 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.902560 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:00Z","lastTransitionTime":"2026-03-11T18:51:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.904783 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:00Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.956572 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51f8b6f6-1b94-408b-ad7f-989d62fa1ba5-hosts-file\") pod \"node-resolver-8lrw5\" (UID: \"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\") " pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.956817 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwcxh\" (UniqueName: \"kubernetes.io/projected/51f8b6f6-1b94-408b-ad7f-989d62fa1ba5-kube-api-access-bwcxh\") pod \"node-resolver-8lrw5\" (UID: \"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\") " pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.957260 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/51f8b6f6-1b94-408b-ad7f-989d62fa1ba5-hosts-file\") pod \"node-resolver-8lrw5\" (UID: \"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\") " pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.961454 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:00 crc kubenswrapper[4842]: E0311 18:51:00.961701 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.961543 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.961499 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:00 crc kubenswrapper[4842]: E0311 18:51:00.962052 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:00 crc kubenswrapper[4842]: E0311 18:51:00.961967 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:00 crc kubenswrapper[4842]: I0311 18:51:00.976170 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwcxh\" (UniqueName: \"kubernetes.io/projected/51f8b6f6-1b94-408b-ad7f-989d62fa1ba5-kube-api-access-bwcxh\") pod \"node-resolver-8lrw5\" (UID: \"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\") " pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.005246 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.005312 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.005325 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.005342 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.005354 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.068392 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8lrw5" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.108573 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.108946 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.108971 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.109000 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.109022 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.124295 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2hhn6"] Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.124938 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.126959 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-csjgs"] Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.127500 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9zrff"] Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.127953 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.128453 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.129824 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.130181 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.130220 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.130299 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.130402 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.130570 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.131075 4842 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.131098 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.131161 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.132029 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.132034 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.132564 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.143935 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.157873 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.158899 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qktb2\" (UniqueName: \"kubernetes.io/projected/fc1cbff7-1f6e-4717-91f1-02477203145c-kube-api-access-qktb2\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.159050 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-system-cni-dir\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.159195 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-kubelet\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.159336 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-conf-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.159491 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-cnibin\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.159604 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-k8s-cni-cncf-io\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.159720 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-hostroot\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.159924 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-socket-dir-parent\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.162707 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.162852 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc1cbff7-1f6e-4717-91f1-02477203145c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.162958 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-system-cni-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163062 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z597f\" (UniqueName: \"kubernetes.io/projected/3827ef7b-1abd-4dea-acf3-474eed7b3860-kube-api-access-z597f\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163155 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-cnibin\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163267 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3827ef7b-1abd-4dea-acf3-474eed7b3860-cni-binary-copy\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163442 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-etc-kubernetes\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163565 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-multus-certs\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163703 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12f22b8b-b227-48b3-b1f1-322dfe40e383-mcd-auth-proxy-config\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163840 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc1cbff7-1f6e-4717-91f1-02477203145c-cni-binary-copy\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.163997 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-cni-bin\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.164108 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12f22b8b-b227-48b3-b1f1-322dfe40e383-proxy-tls\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.164219 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-cni-multus\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.164419 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-daemon-config\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.164505 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/12f22b8b-b227-48b3-b1f1-322dfe40e383-rootfs\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.164554 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ln64\" (UniqueName: \"kubernetes.io/projected/12f22b8b-b227-48b3-b1f1-322dfe40e383-kube-api-access-7ln64\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.164606 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-os-release\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.165129 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-cni-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.165212 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-os-release\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 
18:51:01.165308 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-netns\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.174511 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.191418 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.204829 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.211100 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.211133 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.211149 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.211165 4842 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.211176 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.220222 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.234035 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.259740 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1
504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266156 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-cnibin\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266195 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-k8s-cni-cncf-io\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266219 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-hostroot\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266259 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-socket-dir-parent\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266323 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266356 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc1cbff7-1f6e-4717-91f1-02477203145c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266379 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-system-cni-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266400 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z597f\" (UniqueName: 
\"kubernetes.io/projected/3827ef7b-1abd-4dea-acf3-474eed7b3860-kube-api-access-z597f\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266423 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-cnibin\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266444 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3827ef7b-1abd-4dea-acf3-474eed7b3860-cni-binary-copy\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266465 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-etc-kubernetes\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266484 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-multus-certs\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266504 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12f22b8b-b227-48b3-b1f1-322dfe40e383-mcd-auth-proxy-config\") pod \"machine-config-daemon-csjgs\" (UID: 
\"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266526 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc1cbff7-1f6e-4717-91f1-02477203145c-cni-binary-copy\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266715 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-cni-bin\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266741 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12f22b8b-b227-48b3-b1f1-322dfe40e383-proxy-tls\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266799 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-cni-multus\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266820 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-daemon-config\") pod \"multus-2hhn6\" (UID: 
\"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266838 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/12f22b8b-b227-48b3-b1f1-322dfe40e383-rootfs\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266858 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ln64\" (UniqueName: \"kubernetes.io/projected/12f22b8b-b227-48b3-b1f1-322dfe40e383-kube-api-access-7ln64\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266880 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-os-release\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266899 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-cni-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266922 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-os-release\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " 
pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.266942 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-netns\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267005 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qktb2\" (UniqueName: \"kubernetes.io/projected/fc1cbff7-1f6e-4717-91f1-02477203145c-kube-api-access-qktb2\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267032 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-system-cni-dir\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267063 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-kubelet\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267096 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-conf-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc 
kubenswrapper[4842]: I0311 18:51:01.267176 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-conf-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267228 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-cnibin\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267259 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-k8s-cni-cncf-io\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267322 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-hostroot\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.267380 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-socket-dir-parent\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268295 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268363 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-cni-multus\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268464 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-system-cni-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268532 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-etc-kubernetes\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268564 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-multus-certs\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268817 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-cni-bin\") pod \"multus-2hhn6\" (UID: 
\"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268876 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-os-release\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268905 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/12f22b8b-b227-48b3-b1f1-322dfe40e383-rootfs\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.268946 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-cnibin\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.269012 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-os-release\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.269046 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-run-netns\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.269059 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-cni-dir\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.269096 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc1cbff7-1f6e-4717-91f1-02477203145c-system-cni-dir\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.269105 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3827ef7b-1abd-4dea-acf3-474eed7b3860-host-var-lib-kubelet\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.270340 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc1cbff7-1f6e-4717-91f1-02477203145c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.270837 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3827ef7b-1abd-4dea-acf3-474eed7b3860-multus-daemon-config\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.272767 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/12f22b8b-b227-48b3-b1f1-322dfe40e383-mcd-auth-proxy-config\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.273418 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc1cbff7-1f6e-4717-91f1-02477203145c-cni-binary-copy\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.274644 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3827ef7b-1abd-4dea-acf3-474eed7b3860-cni-binary-copy\") pod \"multus-2hhn6\" (UID: \"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.279830 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d8
1d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.286437 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ln64\" (UniqueName: \"kubernetes.io/projected/12f22b8b-b227-48b3-b1f1-322dfe40e383-kube-api-access-7ln64\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.286557 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12f22b8b-b227-48b3-b1f1-322dfe40e383-proxy-tls\") pod \"machine-config-daemon-csjgs\" (UID: \"12f22b8b-b227-48b3-b1f1-322dfe40e383\") " pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.290697 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z597f\" (UniqueName: \"kubernetes.io/projected/3827ef7b-1abd-4dea-acf3-474eed7b3860-kube-api-access-z597f\") pod \"multus-2hhn6\" (UID: 
\"3827ef7b-1abd-4dea-acf3-474eed7b3860\") " pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.291562 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qktb2\" (UniqueName: \"kubernetes.io/projected/fc1cbff7-1f6e-4717-91f1-02477203145c-kube-api-access-qktb2\") pod \"multus-additional-cni-plugins-9zrff\" (UID: \"fc1cbff7-1f6e-4717-91f1-02477203145c\") " pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.296388 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.309195 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.314812 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.314983 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.315064 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.315148 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.315226 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.329643 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.343735 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.361378 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.373020 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.384897 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.395256 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.396078 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8lrw5" event={"ID":"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5","Type":"ContainerStarted","Data":"0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.396132 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8lrw5" event={"ID":"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5","Type":"ContainerStarted","Data":"7dcd037efcdc1eda8638cc42591e3cdee532221bdd7d17303fa45fcac587d030"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.409909 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.418950 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.418987 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.418998 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.419014 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.419026 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.420389 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.430571 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.442803 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.452080 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.452377 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2hhn6" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.465603 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.470385 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.479336 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.481494 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9zrff" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.494705 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.505674 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsn92"] Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.506971 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: W0311 18:51:01.509774 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc1cbff7_1f6e_4717_91f1_02477203145c.slice/crio-b55e8d3deeb3c6e12cdcf98b1e0f9c7304ef8627be639f419b3bbd4e7bf03db8 WatchSource:0}: Error finding container b55e8d3deeb3c6e12cdcf98b1e0f9c7304ef8627be639f419b3bbd4e7bf03db8: Status 404 returned error can't find the container with id b55e8d3deeb3c6e12cdcf98b1e0f9c7304ef8627be639f419b3bbd4e7bf03db8 Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.512025 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.512533 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.512571 4842 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.512741 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.512778 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.512800 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.512986 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.513953 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.522346 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.522410 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.522429 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.522454 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.522480 4842 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.529114 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scrip
t\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.541377 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.554963 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.568358 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570116 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-685hx\" (UniqueName: \"kubernetes.io/projected/5c32da15-9b98-45c1-be42-d7d0e89428c5-kube-api-access-685hx\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570149 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-config\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570169 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-log-socket\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570185 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570203 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-systemd-units\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570221 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-ovn\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570240 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-bin\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570263 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-kubelet\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570302 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-etc-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570319 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-netd\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570351 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-slash\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570376 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-node-log\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570399 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-env-overrides\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570421 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570442 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570471 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-netns\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570486 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-systemd\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570501 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovn-node-metrics-cert\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570516 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-var-lib-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.570530 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-script-lib\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.581126 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.593904 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.615860 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\
":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377
f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.624589 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.624618 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.624629 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.624650 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.624661 
4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.628886 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.643507 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.657876 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.670939 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-systemd\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 
18:51:01.671106 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovn-node-metrics-cert\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671175 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-var-lib-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671255 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-script-lib\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671373 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-685hx\" (UniqueName: \"kubernetes.io/projected/5c32da15-9b98-45c1-be42-d7d0e89428c5-kube-api-access-685hx\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671459 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-config\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671531 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-log-socket\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671594 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671657 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-bin\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671728 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-systemd-units\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671750 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-ovn\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671766 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-kubelet\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671783 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-etc-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671798 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-netd\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671818 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-env-overrides\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671834 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-slash\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671847 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-node-log\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671860 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671875 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671887 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-netns\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671953 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-netns\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.671984 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-systemd\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673143 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-env-overrides\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673200 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-kubelet\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673236 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-etc-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673285 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-netd\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673324 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-node-log\") pod \"ovnkube-node-xsn92\" (UID: 
\"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673357 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-slash\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673389 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673396 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-ovn\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673620 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-log-socket\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673650 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673672 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-bin\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673692 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-systemd-units\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673947 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-config\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.673994 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-var-lib-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.674509 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-script-lib\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 
18:51:01.674771 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-openvswitch\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.675340 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.675686 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovn-node-metrics-cert\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.688010 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-685hx\" (UniqueName: \"kubernetes.io/projected/5c32da15-9b98-45c1-be42-d7d0e89428c5-kube-api-access-685hx\") pod \"ovnkube-node-xsn92\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.694568 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.711621 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.727051 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.727088 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.727097 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.727115 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.727125 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.730077 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.747615 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.763415 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.775445 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.787754 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.800167 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.809425 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.821633 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.830092 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.830168 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.830185 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.830203 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.830213 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.833137 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.841735 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.847959 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: W0311 18:51:01.855659 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c32da15_9b98_45c1_be42_d7d0e89428c5.slice/crio-6fcd2e0332fdba0c5fb56f8075f1798a46587dde93fb7f43b4cf9bf41603bead WatchSource:0}: Error finding container 6fcd2e0332fdba0c5fb56f8075f1798a46587dde93fb7f43b4cf9bf41603bead: Status 404 returned error can't find the container with id 6fcd2e0332fdba0c5fb56f8075f1798a46587dde93fb7f43b4cf9bf41603bead Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.860365 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.882589 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:01Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.934591 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.934638 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.934649 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.934666 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:01 crc kubenswrapper[4842]: I0311 18:51:01.934675 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:01Z","lastTransitionTime":"2026-03-11T18:51:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.037695 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.037757 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.037782 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.037812 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.037834 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.140947 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.141152 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.141188 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.141224 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.141250 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.245077 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.245165 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.245189 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.245226 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.245249 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.349010 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.349109 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.349139 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.349190 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.349217 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.411760 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5" exitCode=0 Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.411872 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.411981 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"6fcd2e0332fdba0c5fb56f8075f1798a46587dde93fb7f43b4cf9bf41603bead"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.417788 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.417861 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.417886 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"1ba2a19924656a5c47b95d4fa7bd726a4e2cd1882021527f47ec0968754370d5"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 
18:51:02.426439 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc1cbff7-1f6e-4717-91f1-02477203145c" containerID="4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051" exitCode=0 Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.426617 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerDied","Data":"4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.426706 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerStarted","Data":"b55e8d3deeb3c6e12cdcf98b1e0f9c7304ef8627be639f419b3bbd4e7bf03db8"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.432010 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerStarted","Data":"caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.432091 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerStarted","Data":"14a88249bb8d05f78151e9c7435999973dafc2e791e3fc822957a5dfe59d8780"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.434863 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.453016 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.453640 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.453656 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.453680 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.453697 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.467043 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.493305 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.515171 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.531361 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.550752 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.555886 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.555928 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.555940 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.555960 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.555972 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.568398 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.584383 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.596996 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.610313 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.622902 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.641584 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.654382 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.660799 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.660845 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.660903 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.660926 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.660939 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.671369 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.683295 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.698727 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.711305 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.725246 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.744743 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.764662 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.764715 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.764728 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.764751 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.764766 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.765645 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.776484 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.776518 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.776527 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.776542 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.776551 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.780259 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.791417 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.797232 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.798475 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.798527 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.798539 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.798559 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.798571 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.814463 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.816394 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a
9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.818132 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.818164 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.818175 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.818193 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.818205 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.826920 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.831567 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.837537 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.837590 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.837601 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.837622 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.837635 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.850248 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.852008 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.853716 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.853751 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.853764 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.853790 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.853805 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.866213 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.869897 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.870024 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.872381 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.872424 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.872439 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.872460 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.872472 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.880137 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.892670 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:02Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.961365 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.961492 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.961376 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.961760 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.961745 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:02 crc kubenswrapper[4842]: E0311 18:51:02.962089 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.973829 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.974039 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.974117 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.974204 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:02 crc kubenswrapper[4842]: I0311 18:51:02.974323 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:02Z","lastTransitionTime":"2026-03-11T18:51:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.076388 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.076421 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.076432 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.076448 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.076458 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.179314 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.179373 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.179384 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.179402 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.179414 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.283100 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.283506 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.283520 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.283538 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.283548 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.387375 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.387431 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.387443 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.387470 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.387485 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.438227 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.438301 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.438317 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.438328 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.438338 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.440429 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerStarted","Data":"7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7"} Mar 11 18:51:03 crc 
kubenswrapper[4842]: I0311 18:51:03.462950 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.475241 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.486324 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.490720 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.490767 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.490779 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.490795 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.490804 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.499165 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:
51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.510012 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.527906 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.539692 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.553423 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.565612 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.581591 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.593174 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.593206 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.593217 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.593234 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.593246 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.595054 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.603844 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.621944 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.635896 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:03Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.695505 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.696166 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.696194 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.696214 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.696225 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.803779 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.803824 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.803834 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.803850 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.803864 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.906372 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.906418 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.906427 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.906443 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:03 crc kubenswrapper[4842]: I0311 18:51:03.906452 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:03Z","lastTransitionTime":"2026-03-11T18:51:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.009486 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.009544 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.009561 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.009587 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.009604 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.112415 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.112461 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.112470 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.112488 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.112498 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.215453 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.215514 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.215526 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.215548 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.215563 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.318153 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.318203 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.318222 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.318247 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.318265 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.420899 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.420934 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.420942 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.420958 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.420969 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.446687 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc1cbff7-1f6e-4717-91f1-02477203145c" containerID="7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7" exitCode=0 Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.446768 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerDied","Data":"7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.453231 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.462071 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.478183 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.491101 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.503497 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.521387 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.523508 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.523541 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.523552 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.523596 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.523605 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.534564 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.546371 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.565478 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.576057 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.592730 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.601864 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.611505 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.621980 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.625869 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.625910 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.625920 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.625936 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.625945 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.637319 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.728387 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.728425 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.728434 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.728450 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.728459 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.830318 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.830381 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.830399 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.830425 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.830442 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.933426 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.933522 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.933541 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.933567 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.933627 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:04Z","lastTransitionTime":"2026-03-11T18:51:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.961959 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.962004 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.962074 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:04 crc kubenswrapper[4842]: E0311 18:51:04.962154 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:04 crc kubenswrapper[4842]: E0311 18:51:04.962249 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:04 crc kubenswrapper[4842]: E0311 18:51:04.962421 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.980210 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:04 crc kubenswrapper[4842]: I0311 18:51:04.996198 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:04Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.008850 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.043212 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.048154 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc 
kubenswrapper[4842]: I0311 18:51:05.048207 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.048222 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.048249 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.048267 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.077679 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.098390 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.118066 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.128128 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.141423 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.150645 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.150695 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.150710 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.150731 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.150749 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.153126 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.164541 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.177943 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.191887 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.201860 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.253258 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.253482 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.253497 4842 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.253516 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.253527 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.356052 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.356105 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.356122 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.356146 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.356162 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.458180 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.458236 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.458259 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.458341 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.458368 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.460104 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc1cbff7-1f6e-4717-91f1-02477203145c" containerID="0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618" exitCode=0 Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.460163 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerDied","Data":"0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.483942 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.506047 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.525022 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.539095 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.552420 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.562857 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc 
kubenswrapper[4842]: I0311 18:51:05.562933 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.562957 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.562991 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.563015 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.565372 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.579930 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.594490 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.606316 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.618231 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.632301 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.652371 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb8
3bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.667204 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.668369 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.668412 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.668423 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.668439 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.668450 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.688891 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:05Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.771291 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.771325 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.771336 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.771357 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.771369 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.874833 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.874881 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.874895 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.874917 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.874933 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.977527 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.977577 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.977588 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.977605 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:05 crc kubenswrapper[4842]: I0311 18:51:05.977619 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:05Z","lastTransitionTime":"2026-03-11T18:51:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.080520 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.080583 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.080594 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.080613 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.080629 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.183264 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.183335 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.183352 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.183374 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.183412 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.286089 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.286131 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.286141 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.286158 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.286170 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.388878 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.388940 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.388952 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.388972 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.388985 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.471176 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc1cbff7-1f6e-4717-91f1-02477203145c" containerID="165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d" exitCode=0 Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.471293 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerDied","Data":"165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.476468 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.491189 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.491225 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.491238 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.491257 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.491295 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.512658 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.526727 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.538696 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.569791 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.595038 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.595348 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.595403 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.595424 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.595784 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.595807 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.608920 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.622494 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.637386 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.650853 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.664309 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.677619 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.687869 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.700541 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.700580 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.700591 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.700609 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.700621 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.701390 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.722633 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:06Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.803981 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.804049 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.804068 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.804202 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.804223 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.907348 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.907407 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.907421 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.907450 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.907465 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:06Z","lastTransitionTime":"2026-03-11T18:51:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.961252 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.961388 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:06 crc kubenswrapper[4842]: I0311 18:51:06.961458 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:06 crc kubenswrapper[4842]: E0311 18:51:06.961557 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:06 crc kubenswrapper[4842]: E0311 18:51:06.961705 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:06 crc kubenswrapper[4842]: E0311 18:51:06.961906 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.010334 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.010406 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.010435 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.010474 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.010495 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.114095 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.114146 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.114158 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.114179 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.114195 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.218181 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.218379 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.218399 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.218523 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.218555 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.322366 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.322453 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.322478 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.322511 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.322534 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.425533 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.425594 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.425610 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.425631 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.425648 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.454801 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-nrlmz"] Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.455247 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.460141 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.462046 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.462440 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.462942 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.481840 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.486939 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerStarted","Data":"62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.500727 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.525752 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.534644 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.534711 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.534729 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.534755 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.534775 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.534870 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-host\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.534919 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-serviceca\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.535051 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qpr\" (UniqueName: \"kubernetes.io/projected/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-kube-api-access-v9qpr\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.544846 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.567095 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.590850 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.610074 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.626139 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.635955 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-host\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.636026 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-serviceca\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.636085 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-host\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.636103 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qpr\" (UniqueName: \"kubernetes.io/projected/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-kube-api-access-v9qpr\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.638837 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-serviceca\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.642580 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.642662 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.642675 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.642699 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.642713 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.643547 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.663976 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.674818 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qpr\" (UniqueName: \"kubernetes.io/projected/b857b87a-cc03-4a79-8042-f3a7cf84f8c2-kube-api-access-v9qpr\") pod \"node-ca-nrlmz\" (UID: \"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\") " pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.678225 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.697681 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"
name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.715148 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.741384 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.748322 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.748736 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.748758 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.748778 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.748791 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.760679 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.777036 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.792478 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.805711 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.820114 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.836048 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.850108 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.854343 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.854380 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.854390 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.854412 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.854425 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.857037 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nrlmz" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.870222 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.888207 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.908318 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.928192 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.943758 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.958460 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.958502 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.958511 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.958529 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:07 crc 
kubenswrapper[4842]: I0311 18:51:07.958539 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:07Z","lastTransitionTime":"2026-03-11T18:51:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.962540 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.962536 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.976553 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:07 crc kubenswrapper[4842]: I0311 18:51:07.992503 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:07Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.005262 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.062832 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.062879 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.062897 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.062923 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.062943 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.166035 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.166511 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.166526 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.166545 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.166559 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.268971 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.269016 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.269030 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.269052 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.269064 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.371419 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.371494 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.371506 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.371525 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.371538 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.474111 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.474152 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.474164 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.474182 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.474191 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.492424 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc1cbff7-1f6e-4717-91f1-02477203145c" containerID="62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860" exitCode=0 Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.492450 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc1cbff7-1f6e-4717-91f1-02477203145c" containerID="681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758" exitCode=0 Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.492473 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerDied","Data":"62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.492548 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerDied","Data":"681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.496763 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.498691 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.499067 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 
18:51:08.502434 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.502683 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.502706 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.502836 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.505171 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nrlmz" event={"ID":"b857b87a-cc03-4a79-8042-f3a7cf84f8c2","Type":"ContainerStarted","Data":"02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.505220 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nrlmz" event={"ID":"b857b87a-cc03-4a79-8042-f3a7cf84f8c2","Type":"ContainerStarted","Data":"e233b623207221bb4f528467546e6feb3f798613c8d23d81cf5cc2098f2e1f7c"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.527764 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.528939 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.538850 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.550137 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.561839 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.574602 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.583758 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.583833 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.583845 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.583864 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.583873 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.590986 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:
51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.602961 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.621089 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.634029 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.647434 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.660562 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.670566 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.684861 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.688010 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.688048 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.688062 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.689220 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.689243 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.703787 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.715886 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.728068 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.743540 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.764409 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.785117 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.792197 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.792242 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.792255 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.792299 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.792314 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.803243 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.819824 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.836616 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.852009 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.866097 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.878067 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.890809 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc 
kubenswrapper[4842]: I0311 18:51:08.894369 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.894409 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.894419 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.894436 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.894447 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.907037 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.918920 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.934618 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.948299 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.962143 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.962259 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 11 18:51:08 crc kubenswrapper[4842]: E0311 18:51:08.962317 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 11 18:51:08 crc kubenswrapper[4842]: E0311 18:51:08.962458 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.962558 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 18:51:08 crc kubenswrapper[4842]: E0311 18:51:08.962615 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.971841 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:08Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.996874 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.996914 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.996924 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.996938 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:08 crc kubenswrapper[4842]: I0311 18:51:08.996948 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:08Z","lastTransitionTime":"2026-03-11T18:51:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.100144 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.100178 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.100189 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.100204 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.100214 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.203484 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.203531 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.203542 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.203565 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.203581 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.306406 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.306497 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.306515 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.306545 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.306564 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.409987 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.410066 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.410093 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.410171 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.410201 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.513658 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.513749 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.513771 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.513805 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.513836 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.522768 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" event={"ID":"fc1cbff7-1f6e-4717-91f1-02477203145c","Type":"ContainerStarted","Data":"d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.546317 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.567974 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.591852 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.614335 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.617680 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc 
kubenswrapper[4842]: I0311 18:51:09.617779 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.617800 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.617859 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.617880 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.631363 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.664651 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.682075 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.697856 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.715420 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.721314 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.721411 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.721437 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.721471 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.721495 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.735340 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.751727 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.773386 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.789968 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.819832 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.824875 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.824939 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.824957 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.824983 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.824998 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.834902 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:09Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.928105 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.928160 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.928171 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.928190 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:09 crc kubenswrapper[4842]: I0311 18:51:09.928205 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:09Z","lastTransitionTime":"2026-03-11T18:51:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.030935 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.030991 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.031012 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.031039 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.031058 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.133599 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.133649 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.133665 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.133689 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.133709 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.236592 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.236641 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.236661 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.236686 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.236706 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.339166 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.339198 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.339208 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.339224 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.339234 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.441627 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.441658 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.441665 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.441681 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.441691 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.544481 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.544515 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.544523 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.544538 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.544548 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.647678 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.647714 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.647723 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.647740 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.647749 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.750799 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.751659 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.751742 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.752336 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.752515 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.856310 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.856365 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.856377 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.856398 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.856411 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.959967 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.960407 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.960626 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.960782 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.960909 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:10Z","lastTransitionTime":"2026-03-11T18:51:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.961395 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.961675 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:10 crc kubenswrapper[4842]: E0311 18:51:10.962033 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:10 crc kubenswrapper[4842]: E0311 18:51:10.961674 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:10 crc kubenswrapper[4842]: I0311 18:51:10.961515 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:10 crc kubenswrapper[4842]: E0311 18:51:10.962142 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.065088 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.065489 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.065712 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.065916 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.066070 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:11Z","lastTransitionTime":"2026-03-11T18:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.170267 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.170356 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.170374 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.170400 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:11 crc kubenswrapper[4842]: I0311 18:51:11.170420 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:11Z","lastTransitionTime":"2026-03-11T18:51:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.163322 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.163391 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.163417 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.163436 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.163472 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.182061 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/0.log" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.186647 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864" exitCode=1 Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.186857 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:12 crc kubenswrapper[4842]: E0311 18:51:12.187044 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.187451 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.188409 4842 scope.go:117] "RemoveContainer" containerID="dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.187473 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:12 crc kubenswrapper[4842]: E0311 18:51:12.189775 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.206648 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.240287 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:10Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877623 6712 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877762 6712 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877841 6712 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877977 6712 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.878630 6712 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 18:51:10.878657 6712 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 18:51:10.878676 6712 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 18:51:10.878703 6712 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 18:51:10.878733 6712 factory.go:656] Stopping watch factory\\\\nI0311 18:51:10.878741 6712 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 18:51:10.878748 6712 ovnkube.go:599] Stopped ovnkube\\\\nI0311 18:51:10.878761 6712 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 18:51:10.878773 6712 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 
18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.253299 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.266920 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.267108 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.267169 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc 
kubenswrapper[4842]: I0311 18:51:12.267230 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.267297 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.279070 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.294512 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.310827 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.330167 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.346063 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.363789 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.369632 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.369658 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.369667 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.369681 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.369691 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.382141 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.392871 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.406155 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc 
kubenswrapper[4842]: I0311 18:51:12.424727 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.438818 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.455571 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:12Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.472425 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.472459 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.472470 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.472486 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.472495 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.575531 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.575569 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.575580 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.575596 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.575606 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.678635 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.678680 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.678693 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.678716 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.678731 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.781789 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.781846 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.781867 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.781896 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.781942 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.885929 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.885980 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.886000 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.886031 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.886051 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.964545 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:12 crc kubenswrapper[4842]: E0311 18:51:12.964700 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.989497 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.989595 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.989614 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.989645 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:12 crc kubenswrapper[4842]: I0311 18:51:12.989665 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:12Z","lastTransitionTime":"2026-03-11T18:51:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.022647 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.022698 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.022712 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.022735 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.022749 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.041519 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.045883 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.045946 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.045964 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.045992 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.046010 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.064196 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.068472 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.068525 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.068538 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.068562 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.068576 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.089200 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.092595 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.092634 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.092645 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.092662 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.092675 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.110194 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.113673 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.113728 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.113744 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.113764 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.113777 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.127358 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.127503 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.129105 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.129216 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.129341 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.129448 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.129558 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.191954 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/0.log" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.194233 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.194641 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.207906 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.220911 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.232378 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.232412 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.232421 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.232438 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.232451 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.233455 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:
51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.244739 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.312459 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.325572 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.335178 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.335209 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.335220 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.335239 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.335251 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.337400 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.354461 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.366125 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.377307 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.390351 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.399718 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.408577 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm"] Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.409037 4842 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.411539 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.411834 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.412113 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.425672 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.437547 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.437582 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.437592 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.437608 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.437617 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.449039 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:10Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877623 6712 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877762 6712 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877841 6712 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877977 6712 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.878630 6712 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 18:51:10.878657 6712 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 18:51:10.878676 6712 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 18:51:10.878703 6712 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 18:51:10.878733 6712 factory.go:656] Stopping watch factory\\\\nI0311 18:51:10.878741 6712 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 18:51:10.878748 6712 ovnkube.go:599] Stopped ovnkube\\\\nI0311 18:51:10.878761 6712 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 18:51:10.878773 6712 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 
18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.461356 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.474887 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.486683 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.502282 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e
3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.513297 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c99a09c4-b942-4f40-abe9-16b91e662d2b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.513399 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c99a09c4-b942-4f40-abe9-16b91e662d2b-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.513432 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4kg\" (UniqueName: \"kubernetes.io/projected/c99a09c4-b942-4f40-abe9-16b91e662d2b-kube-api-access-bn4kg\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.513454 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c99a09c4-b942-4f40-abe9-16b91e662d2b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.518447 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.536661 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:10Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877623 6712 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877762 6712 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877841 6712 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877977 6712 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.878630 6712 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 18:51:10.878657 6712 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 18:51:10.878676 6712 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 18:51:10.878703 6712 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 18:51:10.878733 6712 factory.go:656] Stopping watch factory\\\\nI0311 18:51:10.878741 6712 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 18:51:10.878748 6712 ovnkube.go:599] Stopped ovnkube\\\\nI0311 18:51:10.878761 6712 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 18:51:10.878773 6712 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 
18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.540379 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.540421 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.540434 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.540454 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.540467 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.549162 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.571619 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb
788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.585649 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.596138 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.614857 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c99a09c4-b942-4f40-abe9-16b91e662d2b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.614924 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c99a09c4-b942-4f40-abe9-16b91e662d2b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.614954 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4kg\" (UniqueName: \"kubernetes.io/projected/c99a09c4-b942-4f40-abe9-16b91e662d2b-kube-api-access-bn4kg\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.614982 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c99a09c4-b942-4f40-abe9-16b91e662d2b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.615622 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c99a09c4-b942-4f40-abe9-16b91e662d2b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.615720 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.615758 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c99a09c4-b942-4f40-abe9-16b91e662d2b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.626864 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c99a09c4-b942-4f40-abe9-16b91e662d2b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.629977 
4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"nam
e\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.640882 4842 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.642107 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4kg\" (UniqueName: \"kubernetes.io/projected/c99a09c4-b942-4f40-abe9-16b91e662d2b-kube-api-access-bn4kg\") pod \"ovnkube-control-plane-749d76644c-5dxdm\" (UID: \"c99a09c4-b942-4f40-abe9-16b91e662d2b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.642415 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.642445 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.642459 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.642479 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.642492 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.659374 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.671074 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.681303 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:13Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.721830 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" Mar 11 18:51:13 crc kubenswrapper[4842]: W0311 18:51:13.740353 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc99a09c4_b942_4f40_abe9_16b91e662d2b.slice/crio-8c03c052aa3b84a17ef27d6986069e4aa175c6fe5b87b39da3f32f4c9f5686d0 WatchSource:0}: Error finding container 8c03c052aa3b84a17ef27d6986069e4aa175c6fe5b87b39da3f32f4c9f5686d0: Status 404 returned error can't find the container with id 8c03c052aa3b84a17ef27d6986069e4aa175c6fe5b87b39da3f32f4c9f5686d0 Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.748037 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.748078 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.748089 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.748162 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc 
kubenswrapper[4842]: I0311 18:51:13.748202 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.851839 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.851900 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.851919 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.851946 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.851963 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.954547 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.954590 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.954599 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.954616 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.954627 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:13Z","lastTransitionTime":"2026-03-11T18:51:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.961889 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:13 crc kubenswrapper[4842]: I0311 18:51:13.961951 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.962013 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:13 crc kubenswrapper[4842]: E0311 18:51:13.962123 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.056990 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.057219 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.057231 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.057247 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.057258 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.162423 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.162465 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.162478 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.162497 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.162510 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.168673 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8vd7m"] Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.169564 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.169668 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.189569 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.201661 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/1.log" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.203735 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/0.log" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.204678 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.210820 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a" exitCode=1 Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.210899 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.210942 4842 scope.go:117] "RemoveContainer" containerID="dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.211799 4842 scope.go:117] "RemoveContainer" containerID="55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a" Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.211987 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.214616 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" event={"ID":"c99a09c4-b942-4f40-abe9-16b91e662d2b","Type":"ContainerStarted","Data":"0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.214666 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" event={"ID":"c99a09c4-b942-4f40-abe9-16b91e662d2b","Type":"ContainerStarted","Data":"a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.214681 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" event={"ID":"c99a09c4-b942-4f40-abe9-16b91e662d2b","Type":"ContainerStarted","Data":"8c03c052aa3b84a17ef27d6986069e4aa175c6fe5b87b39da3f32f4c9f5686d0"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.219783 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x246p\" (UniqueName: \"kubernetes.io/projected/a7a00900-ec76-49e4-9485-131830a0611e-kube-api-access-x246p\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.219906 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.225515 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.243845 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.264849 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.264891 4842 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.264903 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.264922 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.264933 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.267710 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:10Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877623 6712 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877762 6712 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877841 6712 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877977 6712 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.878630 6712 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 18:51:10.878657 6712 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 18:51:10.878676 6712 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 18:51:10.878703 6712 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 18:51:10.878733 6712 factory.go:656] Stopping watch factory\\\\nI0311 18:51:10.878741 6712 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 18:51:10.878748 6712 ovnkube.go:599] Stopped ovnkube\\\\nI0311 18:51:10.878761 6712 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 18:51:10.878773 6712 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 
18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.282060 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.296364 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.309192 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.320348 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.320458 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x246p\" (UniqueName: \"kubernetes.io/projected/a7a00900-ec76-49e4-9485-131830a0611e-kube-api-access-x246p\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.320496 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.320551 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. 
No retries permitted until 2026-03-11 18:51:14.820533431 +0000 UTC m=+120.468229711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.320478 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.331342 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.339149 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x246p\" (UniqueName: \"kubernetes.io/projected/a7a00900-ec76-49e4-9485-131830a0611e-kube-api-access-x246p\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.344341 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.356730 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.366784 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.367017 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.367102 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.367192 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.367264 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.367654 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc 
kubenswrapper[4842]: I0311 18:51:14.395417 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.409343 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.422530 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.431968 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.442610 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.454994 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.470082 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.470128 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.470141 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.470161 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.470173 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.483947 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:10Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877623 6712 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877762 6712 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877841 6712 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877977 6712 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.878630 6712 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 18:51:10.878657 6712 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 18:51:10.878676 6712 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 18:51:10.878703 6712 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 18:51:10.878733 6712 factory.go:656] Stopping watch factory\\\\nI0311 18:51:10.878741 6712 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 18:51:10.878748 6712 ovnkube.go:599] Stopped ovnkube\\\\nI0311 18:51:10.878761 6712 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 18:51:10.878773 6712 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"rplf openshift-dns/node-resolver-8lrw5 openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/machine-config-daemon-csjgs openshift-multus/multus-2hhn6 openshift-multus/multus-additional-cni-plugins-9zrff openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-xsn92]\\\\nI0311 18:51:13.556047 6917 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0311 18:51:13.556061 6917 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556069 6917 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556078 6917 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 in node crc\\\\nI0311 18:51:13.556085 6917 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 after 0 failed attempt(s)\\\\nI0311 18:51:13.556090 6917 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556104 6917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 18:51:13.556162 6917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\
\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.502877 4842 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.516665 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.532351 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.543690 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.553596 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc 
kubenswrapper[4842]: I0311 18:51:14.569062 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.572386 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.572422 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.572435 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.572453 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.572464 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.581373 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.592456 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.608235 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.618312 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.633189 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.649056 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.658766 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.671068 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.675016 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.675039 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.675051 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.675067 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.675077 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.777476 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.777511 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.777524 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.777542 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.777553 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.826978 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.827250 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 18:51:46.827205983 +0000 UTC m=+152.474902303 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.827464 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.827728 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.827836 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. No retries permitted until 2026-03-11 18:51:15.827804689 +0000 UTC m=+121.475501009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.879707 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.879914 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.879996 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.880111 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.880203 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:14Z","lastTransitionTime":"2026-03-11T18:51:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.928904 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.929208 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.929353 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.929499 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.929124 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.929379 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.929873 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.929952 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.929596 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.929608 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.930186 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.930210 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.930397 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:51:46.92975689 +0000 UTC m=+152.577453170 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.930514 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:51:46.930479949 +0000 UTC m=+152.578176269 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.930552 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-11 18:51:46.930537611 +0000 UTC m=+152.578233931 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.930586 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:51:46.930573562 +0000 UTC m=+152.578269882 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.961286 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.962869 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.978504 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:14 crc kubenswrapper[4842]: E0311 18:51:14.980490 4842 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 11 18:51:14 crc kubenswrapper[4842]: I0311 18:51:14.990542 
4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:14Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc 
kubenswrapper[4842]: I0311 18:51:15.021414 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.038694 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: E0311 18:51:15.055404 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.059960 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.078162 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.093880 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.106236 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.116801 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.128429 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.140466 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc 
kubenswrapper[4842]: I0311 18:51:15.149810 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.158912 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.175767 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.192655 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.213112 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd3f278bdd1850174573ee664f000a62b58f31b632358d76f7466641119a9864\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:10Z\\\",\\\"message\\\":\\\"0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877623 6712 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877762 6712 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877841 6712 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.877977 6712 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 18:51:10.878630 6712 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 18:51:10.878657 6712 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 18:51:10.878676 6712 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 18:51:10.878703 6712 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 18:51:10.878733 6712 factory.go:656] Stopping watch factory\\\\nI0311 18:51:10.878741 6712 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 18:51:10.878748 6712 ovnkube.go:599] Stopped ovnkube\\\\nI0311 18:51:10.878761 6712 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 18:51:10.878773 6712 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"rplf openshift-dns/node-resolver-8lrw5 openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/machine-config-daemon-csjgs openshift-multus/multus-2hhn6 openshift-multus/multus-additional-cni-plugins-9zrff openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-xsn92]\\\\nI0311 18:51:13.556047 6917 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0311 18:51:13.556061 6917 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556069 6917 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556078 6917 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 in node crc\\\\nI0311 18:51:13.556085 6917 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 after 0 failed attempt(s)\\\\nI0311 18:51:13.556090 6917 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556104 6917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 18:51:13.556162 6917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\
\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.221432 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/1.log" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.225885 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-c
ontrol-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.226724 4842 scope.go:117] "RemoveContainer" 
containerID="55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a" Mar 11 18:51:15 crc kubenswrapper[4842]: E0311 18:51:15.227014 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.235538 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83
e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc 
kubenswrapper[4842]: I0311 18:51:15.248431 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.260662 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.274942 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.285736 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.300423 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"rplf openshift-dns/node-resolver-8lrw5 openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/machine-config-daemon-csjgs openshift-multus/multus-2hhn6 openshift-multus/multus-additional-cni-plugins-9zrff openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-xsn92]\\\\nI0311 18:51:13.556047 
6917 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0311 18:51:13.556061 6917 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556069 6917 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556078 6917 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 in node crc\\\\nI0311 18:51:13.556085 6917 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 after 0 failed attempt(s)\\\\nI0311 18:51:13.556090 6917 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556104 6917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 18:51:13.556162 6917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.310724 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.320698 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc 
kubenswrapper[4842]: I0311 18:51:15.337404 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.348838 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.359106 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.369488 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.379674 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.388183 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.398228 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.408714 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.416794 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:15Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.837808 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:15 crc kubenswrapper[4842]: E0311 18:51:15.838091 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:15 crc kubenswrapper[4842]: E0311 18:51:15.838262 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. No retries permitted until 2026-03-11 18:51:17.838217575 +0000 UTC m=+123.485914035 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.962451 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.962453 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:15 crc kubenswrapper[4842]: E0311 18:51:15.962934 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:15 crc kubenswrapper[4842]: I0311 18:51:15.962566 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:15 crc kubenswrapper[4842]: E0311 18:51:15.963117 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:15 crc kubenswrapper[4842]: E0311 18:51:15.963542 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:16 crc kubenswrapper[4842]: I0311 18:51:16.961698 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:16 crc kubenswrapper[4842]: E0311 18:51:16.963119 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:17 crc kubenswrapper[4842]: I0311 18:51:17.858640 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:17 crc kubenswrapper[4842]: E0311 18:51:17.858961 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:17 crc kubenswrapper[4842]: E0311 18:51:17.859322 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. No retries permitted until 2026-03-11 18:51:21.859257342 +0000 UTC m=+127.506953662 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:17 crc kubenswrapper[4842]: I0311 18:51:17.961247 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:17 crc kubenswrapper[4842]: I0311 18:51:17.961386 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:17 crc kubenswrapper[4842]: E0311 18:51:17.961465 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:17 crc kubenswrapper[4842]: E0311 18:51:17.961609 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:17 crc kubenswrapper[4842]: I0311 18:51:17.961248 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:17 crc kubenswrapper[4842]: E0311 18:51:17.961768 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:18 crc kubenswrapper[4842]: I0311 18:51:18.962666 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:18 crc kubenswrapper[4842]: E0311 18:51:18.962987 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:19 crc kubenswrapper[4842]: I0311 18:51:19.961709 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:19 crc kubenswrapper[4842]: I0311 18:51:19.961781 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:19 crc kubenswrapper[4842]: E0311 18:51:19.961913 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:19 crc kubenswrapper[4842]: I0311 18:51:19.962245 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:19 crc kubenswrapper[4842]: E0311 18:51:19.962399 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:19 crc kubenswrapper[4842]: E0311 18:51:19.962699 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:20 crc kubenswrapper[4842]: E0311 18:51:20.057247 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:51:20 crc kubenswrapper[4842]: I0311 18:51:20.961622 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:20 crc kubenswrapper[4842]: E0311 18:51:20.961863 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:21 crc kubenswrapper[4842]: I0311 18:51:21.911074 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:21 crc kubenswrapper[4842]: E0311 18:51:21.911376 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:21 crc kubenswrapper[4842]: E0311 18:51:21.912865 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. No retries permitted until 2026-03-11 18:51:29.912837835 +0000 UTC m=+135.560534115 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:21 crc kubenswrapper[4842]: I0311 18:51:21.962203 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:21 crc kubenswrapper[4842]: I0311 18:51:21.962203 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:21 crc kubenswrapper[4842]: I0311 18:51:21.962929 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:21 crc kubenswrapper[4842]: E0311 18:51:21.963139 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:21 crc kubenswrapper[4842]: E0311 18:51:21.963396 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:21 crc kubenswrapper[4842]: E0311 18:51:21.963525 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:22 crc kubenswrapper[4842]: I0311 18:51:22.961710 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:22 crc kubenswrapper[4842]: E0311 18:51:22.961914 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.350624 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.350661 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.350670 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.350685 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.350694 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:23Z","lastTransitionTime":"2026-03-11T18:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.361488 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:23Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.364443 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.364489 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.364503 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.364526 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.364541 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:23Z","lastTransitionTime":"2026-03-11T18:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.374776 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:23Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.378586 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.378631 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.378642 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.378660 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.378672 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:23Z","lastTransitionTime":"2026-03-11T18:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.390341 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:23Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.393176 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.393210 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.393219 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.393235 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.393244 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:23Z","lastTransitionTime":"2026-03-11T18:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.405174 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:23Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.408670 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.408719 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.408731 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.408750 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.408762 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:23Z","lastTransitionTime":"2026-03-11T18:51:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.419087 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:23Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.419696 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.961814 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.962012 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.962612 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:23 crc kubenswrapper[4842]: I0311 18:51:23.962686 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.962824 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:23 crc kubenswrapper[4842]: E0311 18:51:23.962963 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:24 crc kubenswrapper[4842]: I0311 18:51:24.961917 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:24 crc kubenswrapper[4842]: E0311 18:51:24.962417 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:24 crc kubenswrapper[4842]: I0311 18:51:24.980974 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:24Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:24 crc kubenswrapper[4842]: I0311 18:51:24.994846 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:24Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.007747 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc 
kubenswrapper[4842]: I0311 18:51:25.038417 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: E0311 18:51:25.057865 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.068028 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.086022 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.101809 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.113431 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.130041 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.144022 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.158315 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.171832 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.184839 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.200424 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e
3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.217623 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.238197 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.241137 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"rplf openshift-dns/node-resolver-8lrw5 openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/machine-config-daemon-csjgs openshift-multus/multus-2hhn6 openshift-multus/multus-additional-cni-plugins-9zrff openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-xsn92]\\\\nI0311 18:51:13.556047 
6917 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0311 18:51:13.556061 6917 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556069 6917 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556078 6917 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 in node crc\\\\nI0311 18:51:13.556085 6917 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 after 0 failed attempt(s)\\\\nI0311 18:51:13.556090 6917 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556104 6917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 18:51:13.556162 6917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.253760 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.271054 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.283231 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.295886 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.306732 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.320157 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.339972 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"rplf openshift-dns/node-resolver-8lrw5 openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/machine-config-daemon-csjgs openshift-multus/multus-2hhn6 openshift-multus/multus-additional-cni-plugins-9zrff openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-xsn92]\\\\nI0311 18:51:13.556047 
6917 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0311 18:51:13.556061 6917 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556069 6917 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556078 6917 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 in node crc\\\\nI0311 18:51:13.556085 6917 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 after 0 failed attempt(s)\\\\nI0311 18:51:13.556090 6917 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556104 6917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 18:51:13.556162 6917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.351375 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.361800 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.373537 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.385251 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.395390 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc 
kubenswrapper[4842]: I0311 18:51:25.411227 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.422433 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.431253 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.445247 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.454005 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.466096 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:25Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.961970 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.963001 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:25 crc kubenswrapper[4842]: E0311 18:51:25.963309 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.963464 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:25 crc kubenswrapper[4842]: E0311 18:51:25.963644 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:25 crc kubenswrapper[4842]: E0311 18:51:25.964117 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:25 crc kubenswrapper[4842]: I0311 18:51:25.975867 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 11 18:51:26 crc kubenswrapper[4842]: I0311 18:51:26.962166 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:26 crc kubenswrapper[4842]: E0311 18:51:26.962319 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:26 crc kubenswrapper[4842]: I0311 18:51:26.963720 4842 scope.go:117] "RemoveContainer" containerID="55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.271726 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/1.log" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.275410 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146"} Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.276077 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.289962 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.304786 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.318364 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.353165 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e
3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.381780 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.404003 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"rplf openshift-dns/node-resolver-8lrw5 openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/machine-config-daemon-csjgs openshift-multus/multus-2hhn6 openshift-multus/multus-additional-cni-plugins-9zrff openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-xsn92]\\\\nI0311 18:51:13.556047 
6917 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0311 18:51:13.556061 6917 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556069 6917 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556078 6917 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 in node crc\\\\nI0311 18:51:13.556085 6917 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 after 0 failed attempt(s)\\\\nI0311 18:51:13.556090 6917 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556104 6917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 18:51:13.556162 6917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.415434 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.426895 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc 
kubenswrapper[4842]: I0311 18:51:27.447172 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.463837 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.480641 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.496133 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.518231 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.532426 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.544751 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.557461 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.567457 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.576429 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.961826 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.961903 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:27 crc kubenswrapper[4842]: I0311 18:51:27.961942 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:27 crc kubenswrapper[4842]: E0311 18:51:27.962408 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:27 crc kubenswrapper[4842]: E0311 18:51:27.962698 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:27 crc kubenswrapper[4842]: E0311 18:51:27.962824 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.280934 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/2.log" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.281880 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/1.log" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.285463 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146" exitCode=1 Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.285509 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146"} Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.285549 4842 scope.go:117] "RemoveContainer" containerID="55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.289053 4842 scope.go:117] "RemoveContainer" containerID="d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146" Mar 11 18:51:28 crc 
kubenswrapper[4842]: E0311 18:51:28.289512 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.313044 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.325981 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.339811 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.359376 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.375605 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc 
kubenswrapper[4842]: I0311 18:51:28.393973 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.412894 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.438619 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.470362 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.502151 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55932cbc1f89b70f170b23dc165acd500ca69431e4ad118aff7c3ce43c5f915a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"message\\\":\\\"rplf openshift-dns/node-resolver-8lrw5 openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/machine-config-daemon-csjgs openshift-multus/multus-2hhn6 openshift-multus/multus-additional-cni-plugins-9zrff openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-xsn92]\\\\nI0311 18:51:13.556047 
6917 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0311 18:51:13.556061 6917 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556069 6917 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556078 6917 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 in node crc\\\\nI0311 18:51:13.556085 6917 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-xsn92 after 0 failed attempt(s)\\\\nI0311 18:51:13.556090 6917 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-xsn92\\\\nI0311 18:51:13.556104 6917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0311 18:51:13.556162 6917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added 
to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.521651 4842 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.554741 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.578542 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.597865 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.618553 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.641327 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.659732 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc kubenswrapper[4842]: I0311 18:51:28.677619 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:28Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:28 crc 
kubenswrapper[4842]: I0311 18:51:28.961311 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:28 crc kubenswrapper[4842]: E0311 18:51:28.961447 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.290522 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/2.log" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.294623 4842 scope.go:117] "RemoveContainer" containerID="d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146" Mar 11 18:51:29 crc kubenswrapper[4842]: E0311 18:51:29.294916 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.323058 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.343632 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.361110 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.377257 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.398149 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.416821 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.433118 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc 
kubenswrapper[4842]: I0311 18:51:29.452456 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.469699 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.490105 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.502946 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.516576 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc 
kubenswrapper[4842]: I0311 18:51:29.533754 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.544437 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.558518 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.570375 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.587597 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.601356 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:29Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.961130 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.961218 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:29 crc kubenswrapper[4842]: I0311 18:51:29.961155 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:29 crc kubenswrapper[4842]: E0311 18:51:29.961250 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:29 crc kubenswrapper[4842]: E0311 18:51:29.961477 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:29 crc kubenswrapper[4842]: E0311 18:51:29.961591 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:30 crc kubenswrapper[4842]: I0311 18:51:30.001923 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:30 crc kubenswrapper[4842]: E0311 18:51:30.002036 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:30 crc kubenswrapper[4842]: E0311 18:51:30.002118 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. No retries permitted until 2026-03-11 18:51:46.002078872 +0000 UTC m=+151.649775152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:30 crc kubenswrapper[4842]: E0311 18:51:30.059734 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:51:30 crc kubenswrapper[4842]: I0311 18:51:30.961427 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:30 crc kubenswrapper[4842]: E0311 18:51:30.961754 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:31 crc kubenswrapper[4842]: I0311 18:51:31.961789 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:31 crc kubenswrapper[4842]: I0311 18:51:31.961835 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:31 crc kubenswrapper[4842]: I0311 18:51:31.961897 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:31 crc kubenswrapper[4842]: E0311 18:51:31.961915 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:31 crc kubenswrapper[4842]: E0311 18:51:31.961999 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:31 crc kubenswrapper[4842]: E0311 18:51:31.962052 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:32 crc kubenswrapper[4842]: I0311 18:51:32.961806 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:32 crc kubenswrapper[4842]: E0311 18:51:32.961955 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.793241 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.793309 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.793323 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.793342 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.793358 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:33Z","lastTransitionTime":"2026-03-11T18:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.809762 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:33Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.813857 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.813951 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.813982 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.813995 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.814006 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:33Z","lastTransitionTime":"2026-03-11T18:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.827913 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:33Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.831195 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.831237 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.831251 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.831270 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.831298 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:33Z","lastTransitionTime":"2026-03-11T18:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.841666 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:33Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.845920 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.845977 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.845990 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.846005 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.846016 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:33Z","lastTransitionTime":"2026-03-11T18:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.861173 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:33Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.866104 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.866157 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.866171 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.866190 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.866203 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:33Z","lastTransitionTime":"2026-03-11T18:51:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.882805 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:33Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.882911 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.961606 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.961625 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.961743 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:33 crc kubenswrapper[4842]: I0311 18:51:33.961779 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.961864 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:33 crc kubenswrapper[4842]: E0311 18:51:33.961925 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:34 crc kubenswrapper[4842]: I0311 18:51:34.961567 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:34 crc kubenswrapper[4842]: E0311 18:51:34.961761 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:34 crc kubenswrapper[4842]: I0311 18:51:34.984202 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:34Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.000808 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:34Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.011550 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.023333 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.033537 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.056566 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b5
43c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: E0311 18:51:35.060400 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.072734 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.092180 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.108185 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.123060 4842 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.145594 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.163673 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.182673 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.200845 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.215312 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28
384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.227751 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc 
kubenswrapper[4842]: I0311 18:51:35.257386 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.274137 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:35Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.961431 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.961499 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:35 crc kubenswrapper[4842]: I0311 18:51:35.961449 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:35 crc kubenswrapper[4842]: E0311 18:51:35.961566 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:35 crc kubenswrapper[4842]: E0311 18:51:35.961654 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:35 crc kubenswrapper[4842]: E0311 18:51:35.961688 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:36 crc kubenswrapper[4842]: I0311 18:51:36.961969 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:36 crc kubenswrapper[4842]: E0311 18:51:36.962161 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:37 crc kubenswrapper[4842]: I0311 18:51:37.961226 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:37 crc kubenswrapper[4842]: E0311 18:51:37.962753 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:37 crc kubenswrapper[4842]: I0311 18:51:37.962851 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:37 crc kubenswrapper[4842]: I0311 18:51:37.962941 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:37 crc kubenswrapper[4842]: E0311 18:51:37.963079 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:37 crc kubenswrapper[4842]: E0311 18:51:37.963229 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:38 crc kubenswrapper[4842]: I0311 18:51:38.961108 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:38 crc kubenswrapper[4842]: E0311 18:51:38.961291 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:39 crc kubenswrapper[4842]: I0311 18:51:39.961146 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:39 crc kubenswrapper[4842]: I0311 18:51:39.961159 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:39 crc kubenswrapper[4842]: I0311 18:51:39.961468 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:39 crc kubenswrapper[4842]: E0311 18:51:39.961563 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:39 crc kubenswrapper[4842]: E0311 18:51:39.961324 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:39 crc kubenswrapper[4842]: E0311 18:51:39.961724 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:40 crc kubenswrapper[4842]: E0311 18:51:40.062843 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:51:40 crc kubenswrapper[4842]: I0311 18:51:40.961849 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:40 crc kubenswrapper[4842]: E0311 18:51:40.962607 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:41 crc kubenswrapper[4842]: I0311 18:51:41.961958 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:41 crc kubenswrapper[4842]: E0311 18:51:41.962096 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:41 crc kubenswrapper[4842]: I0311 18:51:41.962155 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:41 crc kubenswrapper[4842]: I0311 18:51:41.962175 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:41 crc kubenswrapper[4842]: E0311 18:51:41.962220 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:41 crc kubenswrapper[4842]: E0311 18:51:41.962459 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:42 crc kubenswrapper[4842]: I0311 18:51:42.961815 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:42 crc kubenswrapper[4842]: E0311 18:51:42.962131 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:43 crc kubenswrapper[4842]: I0311 18:51:43.961099 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:43 crc kubenswrapper[4842]: E0311 18:51:43.961679 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:43 crc kubenswrapper[4842]: I0311 18:51:43.962309 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:43 crc kubenswrapper[4842]: I0311 18:51:43.965322 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:43 crc kubenswrapper[4842]: E0311 18:51:43.965370 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:43 crc kubenswrapper[4842]: E0311 18:51:43.965459 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:43 crc kubenswrapper[4842]: I0311 18:51:43.967001 4842 scope.go:117] "RemoveContainer" containerID="d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146" Mar 11 18:51:43 crc kubenswrapper[4842]: E0311 18:51:43.967719 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.009763 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.009813 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.009822 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.009838 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.009849 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:44Z","lastTransitionTime":"2026-03-11T18:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:44 crc kubenswrapper[4842]: E0311 18:51:44.022498 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.026503 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.026623 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.026692 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.026756 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.026819 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:44Z","lastTransitionTime":"2026-03-11T18:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:44 crc kubenswrapper[4842]: E0311 18:51:44.038303 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.041847 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.041896 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.041909 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.041926 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.041938 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:44Z","lastTransitionTime":"2026-03-11T18:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:44 crc kubenswrapper[4842]: E0311 18:51:44.055410 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.059089 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.059127 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.059135 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.059150 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.059160 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:44Z","lastTransitionTime":"2026-03-11T18:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:44 crc kubenswrapper[4842]: E0311 18:51:44.075235 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.078564 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.078658 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.078673 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.078686 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.078695 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:44Z","lastTransitionTime":"2026-03-11T18:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:44 crc kubenswrapper[4842]: E0311 18:51:44.090164 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:44 crc kubenswrapper[4842]: E0311 18:51:44.090294 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.961899 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:44 crc kubenswrapper[4842]: E0311 18:51:44.962032 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:44 crc kubenswrapper[4842]: I0311 18:51:44.983701 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.001029 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75
e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:44Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.019687 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.039135 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.058080 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: E0311 18:51:45.063577 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.078216 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\
"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.093658 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc 
kubenswrapper[4842]: I0311 18:51:45.113958 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.133901 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.154064 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.170579 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.187725 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc 
kubenswrapper[4842]: I0311 18:51:45.202839 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.216095 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.233792 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.249623 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.267567 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.280686 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:45Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.962008 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.962050 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:45 crc kubenswrapper[4842]: I0311 18:51:45.962026 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:45 crc kubenswrapper[4842]: E0311 18:51:45.962129 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:45 crc kubenswrapper[4842]: E0311 18:51:45.962207 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:45 crc kubenswrapper[4842]: E0311 18:51:45.962315 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:46 crc kubenswrapper[4842]: I0311 18:51:46.085228 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.085429 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.085505 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. No retries permitted until 2026-03-11 18:52:18.085486391 +0000 UTC m=+183.733182671 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: I0311 18:51:46.893682 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.893942 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:52:50.893921774 +0000 UTC m=+216.541618064 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:51:46 crc kubenswrapper[4842]: I0311 18:51:46.962046 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.962155 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:46 crc kubenswrapper[4842]: I0311 18:51:46.994439 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:46 crc kubenswrapper[4842]: I0311 18:51:46.994517 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:46 crc kubenswrapper[4842]: I0311 18:51:46.994587 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:46 crc kubenswrapper[4842]: I0311 18:51:46.994628 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994650 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994693 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994706 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994706 4842 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994657 4842 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994764 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-11 18:52:50.994734988 +0000 UTC m=+216.642431268 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994780 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:52:50.994774849 +0000 UTC m=+216.642471129 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994792 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 18:52:50.994786149 +0000 UTC m=+216.642482429 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994789 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994818 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994831 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:46 crc kubenswrapper[4842]: E0311 18:51:46.994902 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:52:50.994881832 +0000 UTC m=+216.642578202 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:51:47 crc kubenswrapper[4842]: I0311 18:51:47.962147 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:47 crc kubenswrapper[4842]: E0311 18:51:47.962317 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:47 crc kubenswrapper[4842]: I0311 18:51:47.962520 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:47 crc kubenswrapper[4842]: I0311 18:51:47.962543 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:47 crc kubenswrapper[4842]: E0311 18:51:47.962739 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:47 crc kubenswrapper[4842]: E0311 18:51:47.962856 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.356975 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/0.log" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.357016 4842 generic.go:334] "Generic (PLEG): container finished" podID="3827ef7b-1abd-4dea-acf3-474eed7b3860" containerID="caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf" exitCode=1 Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.357046 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerDied","Data":"caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf"} Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.357448 4842 scope.go:117] "RemoveContainer" containerID="caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.379326 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.392618 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.407245 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"2026-03-11T18:51:02+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90\\\\n2026-03-11T18:51:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90 to /host/opt/cni/bin/\\\\n2026-03-11T18:51:03Z [verbose] multus-daemon started\\\\n2026-03-11T18:51:03Z [verbose] Readiness Indicator file check\\\\n2026-03-11T18:51:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.419830 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.429093 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc 
kubenswrapper[4842]: I0311 18:51:48.447860 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.468156 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.485828 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.497314 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.509384 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.519085 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.527593 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.540629 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b5
43c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.551551 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.561699 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.571158 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.581891 4842 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.597383 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:48Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:48 crc kubenswrapper[4842]: I0311 18:51:48.961768 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:48 crc kubenswrapper[4842]: E0311 18:51:48.962145 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.362182 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/0.log" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.362295 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerStarted","Data":"42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a"} Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.376549 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.389924 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.409626 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"2026-03-11T18:51:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90\\\\n2026-03-11T18:51:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90 to /host/opt/cni/bin/\\\\n2026-03-11T18:51:03Z [verbose] multus-daemon started\\\\n2026-03-11T18:51:03Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T18:51:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.424432 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.437363 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc 
kubenswrapper[4842]: I0311 18:51:49.465881 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.480119 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.495836 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.509460 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.521540 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.532960 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.543334 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.564000 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b5
43c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.575718 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.586878 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.600015 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.615141 4842 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.636169 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:49Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.961809 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.961857 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:49 crc kubenswrapper[4842]: I0311 18:51:49.961809 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:49 crc kubenswrapper[4842]: E0311 18:51:49.961947 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:49 crc kubenswrapper[4842]: E0311 18:51:49.962075 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:49 crc kubenswrapper[4842]: E0311 18:51:49.962195 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:50 crc kubenswrapper[4842]: E0311 18:51:50.065339 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:51:50 crc kubenswrapper[4842]: I0311 18:51:50.961254 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:50 crc kubenswrapper[4842]: E0311 18:51:50.961476 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:51 crc kubenswrapper[4842]: I0311 18:51:51.961995 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:51 crc kubenswrapper[4842]: I0311 18:51:51.962060 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:51 crc kubenswrapper[4842]: E0311 18:51:51.962135 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:51 crc kubenswrapper[4842]: I0311 18:51:51.962196 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:51 crc kubenswrapper[4842]: E0311 18:51:51.962367 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:51 crc kubenswrapper[4842]: E0311 18:51:51.962496 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:52 crc kubenswrapper[4842]: I0311 18:51:52.961604 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:52 crc kubenswrapper[4842]: E0311 18:51:52.962058 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:52 crc kubenswrapper[4842]: I0311 18:51:52.978141 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 11 18:51:53 crc kubenswrapper[4842]: I0311 18:51:53.961798 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:53 crc kubenswrapper[4842]: I0311 18:51:53.961902 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:53 crc kubenswrapper[4842]: E0311 18:51:53.961933 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:53 crc kubenswrapper[4842]: I0311 18:51:53.961943 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:53 crc kubenswrapper[4842]: E0311 18:51:53.962044 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:53 crc kubenswrapper[4842]: E0311 18:51:53.962192 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.373528 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.373564 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.373607 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.373624 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.373642 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:54Z","lastTransitionTime":"2026-03-11T18:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:54 crc kubenswrapper[4842]: E0311 18:51:54.385263 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:54Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.388685 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.388717 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.388725 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.388739 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.388748 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:54Z","lastTransitionTime":"2026-03-11T18:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:54 crc kubenswrapper[4842]: E0311 18:51:54.398371 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:54Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.401261 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.401326 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.401340 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.401354 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.401363 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:54Z","lastTransitionTime":"2026-03-11T18:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.413936    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.413960    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.413967    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.413979    4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.413988    4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:54Z","lastTransitionTime":"2026-03-11T18:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:54 crc kubenswrapper[4842]: E0311 18:51:54.423755 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:54Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.426466 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.426489 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.426500 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.426514 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.426524 4842 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:51:54Z","lastTransitionTime":"2026-03-11T18:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 18:51:54 crc kubenswrapper[4842]: E0311 18:51:54.435910 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16dedd3d-ff19-42b0-bfef-c82bb1fa68db\\\",\\\"systemUUID\\\":\\\"4a5eeb05-4676-462d-b71e-ee04d871eea1\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:54Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:54 crc kubenswrapper[4842]: E0311 18:51:54.436018 4842 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.961337 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:54 crc kubenswrapper[4842]: E0311 18:51:54.961481 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:54 crc kubenswrapper[4842]: I0311 18:51:54.992825 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:54Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.006421 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75
e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.019730 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.031988 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.043907 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"2026-03-11T18:51:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90\\\\n2026-03-11T18:51:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90 to /host/opt/cni/bin/\\\\n2026-03-11T18:51:03Z [verbose] multus-daemon started\\\\n2026-03-11T18:51:03Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T18:51:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.055063 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.065092 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc 
kubenswrapper[4842]: E0311 18:51:55.065766 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.080236 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8302d63-7ad0-4281-9be8-7111c1d59c11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9aa07a66717a26f4250e54999ff868cd607095da1b1cd270e93f8f06abc9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\
\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 18:49:45.409427 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 18:49:45.410661 1 observer_polling.go:159] Starting file observer\\\\nI0311 18:49:45.411784 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 18:49:45.412867 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 18:50:14.971705 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 18:50:14.971835 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26431d830e73b55e120456a559859496fc84ac10406cf27322cb1a98e8b7b56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac00bdbaf4507366bedd6e12748fa4e54c702e1cc9de8e0f712b55b486421f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.092418 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.104665 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.116444 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.126010 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.138240 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc 
kubenswrapper[4842]: I0311 18:51:55.150491 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.160894 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.177194 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.192342 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.210534 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.223576 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:55Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.961759 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.961848 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:55 crc kubenswrapper[4842]: E0311 18:51:55.961911 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:55 crc kubenswrapper[4842]: I0311 18:51:55.961870 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:55 crc kubenswrapper[4842]: E0311 18:51:55.962020 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:55 crc kubenswrapper[4842]: E0311 18:51:55.962097 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:56 crc kubenswrapper[4842]: I0311 18:51:56.961239 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:56 crc kubenswrapper[4842]: E0311 18:51:56.961413 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:56 crc kubenswrapper[4842]: I0311 18:51:56.963093 4842 scope.go:117] "RemoveContainer" containerID="d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.389533 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/2.log" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.392331 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1"} Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.393305 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.411100 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.430470 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c 
openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ov
nkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.441385 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.454400 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.467649 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"2026-03-11T18:51:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90\\\\n2026-03-11T18:51:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90 to /host/opt/cni/bin/\\\\n2026-03-11T18:51:03Z [verbose] multus-daemon started\\\\n2026-03-11T18:51:03Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T18:51:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.479434 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.490842 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc 
kubenswrapper[4842]: I0311 18:51:57.510896 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.526241 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.545162 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.563408 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.573553 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.589384 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8302d63-7ad0-4281-9be8-7111c1d59c11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9aa07a66717a26f4250e54999ff868cd607095da1b1cd270e93f8f06abc9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 18:49:45.409427 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 18:49:45.410661 1 observer_polling.go:159] Starting file observer\\\\nI0311 18:49:45.411784 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 18:49:45.412867 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 18:50:14.971705 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 18:50:14.971835 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26431d830e73b55e120456a559859496fc84ac10406cf27322cb1a98e8b7b56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac00bdbaf4507366bedd6e12748fa4e54c702e1cc9de8e0f712b55b486421f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.602020 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.613676 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.627880 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16567
1f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.641658 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.659559 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.674249 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.961461 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.961649 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:57 crc kubenswrapper[4842]: E0311 18:51:57.961838 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:57 crc kubenswrapper[4842]: I0311 18:51:57.961921 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:57 crc kubenswrapper[4842]: E0311 18:51:57.962144 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:51:57 crc kubenswrapper[4842]: E0311 18:51:57.962401 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.399007 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/3.log" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.399690 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/2.log" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.403330 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1" exitCode=1 Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.403418 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" 
event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1"} Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.403481 4842 scope.go:117] "RemoveContainer" containerID="d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.404277 4842 scope.go:117] "RemoveContainer" containerID="2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1" Mar 11 18:51:58 crc kubenswrapper[4842]: E0311 18:51:58.404531 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.421608 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.439557 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.455263 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.475238 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e
3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.496804 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.528197 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34924f7268b229b34b1d7d6a7c4bd5f12e75fc5b1de8c50e2741bfcae84b146\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:27Z\\\",\\\"message\\\":\\\"plate:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 18:51:27.858200 7160 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable 
to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:27Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:27.858168 7160 obj_retry.go:409] Going to retry *v1.Pod resource setup for 18 objects: [openshift-dns/node-resolver-8lrw5 openshift-multus/network-metrics-daemon-8vd7m openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-image-registry/node-c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:57Z\\\",\\\"message\\\":\\\":[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 18:51:57.790801 7483 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF0311 18:51:57.790803 7483 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:57.790812 7483 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0311 18:51:57.790823 7483 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0311 18:51:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.542317 4842 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.557227 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc 
kubenswrapper[4842]: I0311 18:51:58.585698 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.603022 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.617901 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.635935 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.649814 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"2026-03-11T18:51:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90\\\\n2026-03-11T18:51:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90 to /host/opt/cni/bin/\\\\n2026-03-11T18:51:03Z [verbose] multus-daemon started\\\\n2026-03-11T18:51:03Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T18:51:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.662360 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.677930 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8302d63-7ad0-4281-9be8-7111c1d59c11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9aa07a66717a26f4250e54999ff868cd607095da1b1cd270e93f8f06abc9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 18:49:45.409427 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 18:49:45.410661 1 observer_polling.go:159] Starting file observer\\\\nI0311 18:49:45.411784 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 18:49:45.412867 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 18:50:14.971705 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 18:50:14.971835 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26431d830e73b55e120456a559859496fc84ac10406cf27322cb1a98e8b7b56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac00bdbaf4507366bedd6e12748fa4e54c702e1cc9de8e0f712b55b486421f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.694478 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.727235 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.747576 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.762851 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:58Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:58 crc kubenswrapper[4842]: I0311 18:51:58.961587 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:51:58 crc kubenswrapper[4842]: E0311 18:51:58.961723 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.408394 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/3.log" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.412929 4842 scope.go:117] "RemoveContainer" containerID="2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1" Mar 11 18:51:59 crc kubenswrapper[4842]: E0311 18:51:59.413092 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.424580 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4681ce2-058d-4b87-8359-18bd40d4ed2b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ba54613b5d0895333eaf071bee0d4efb638548caa0c40f6b819d9cf7be7052c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d9f330bb0d4d29029bf1e0b825e3d3c11d1b8a7cdc292f88ccecf0dc6f9ee59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf34e68ea1fbf4c4c1414bd0c4cc44489b01230852be8bfc048c5b1ae63951a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eece50e36c6a635a1e11b45c3e040d1b68ff5a65bcbbff3c42b75effbddae489\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.439067 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.452842 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.462798 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nrlmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b857b87a-cc03-4a79-8042-f3a7cf84f8c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02f83bbb2c77eed3026c67f231fb4edf2606e9cc060ce9a8e146606ae9c623b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v9qpr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nrlmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.473587 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8302d63-7ad0-4281-9be8-7111c1d59c11\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9aa07a66717a26f4250e54999ff868cd607095da1b1cd270e93f8f06abc9a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62cb7d84db8cd95265e1535d11c93629d289903b993449521aaf6d9c0afcd049\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0311 18:49:45.409427 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 18:49:45.410661 1 observer_polling.go:159] Starting file observer\\\\nI0311 18:49:45.411784 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 18:49:45.412867 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 18:50:14.971705 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 18:50:14.971835 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:45Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26431d830e73b55e120456a559859496fc84ac10406cf27322cb1a98e8b7b56\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ac00bdbaf4507366bedd6e12748fa4e54c702e1cc9de8e0f712b55b486421f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.486920 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55897a5552e92682d1014169725792daead44f2a02db2303f8583fd44c00835\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.497836 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8lrw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51f8b6f6-1b94-408b-ad7f-989d62fa1ba5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e8b9c7202da4f8d96b5412f80b0fb0d79e9f1e5126e45f08215d3c77c93a5a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bwcxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8lrw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.511850 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9zrff" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc1cbff7-1f6e-4717-91f1-02477203145c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d19698de97494d5577a1e6659ef2e1a1f8df075e
3f2c91cbc7056932d6788733\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b94cbea04f574f99f29daefd0e341e7b22cbaf88d9784097aff85cee99ca051\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c1a24dc9ba60514a5ddfe9e68cfd92316512e8347722c72ae84b4e9616a21c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0703f6d6239bb450784119c542acb8148deb83bc5af2126a3a7f2890ccaf3618\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://165671f1f0b6a7a9d8bb375ac7042cc080459d1837650e0abe9832567648586d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62adb164e0f8b459886e7fbd7ade73b543c1f266eefaf0fdc89a8aaff3ac4860\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://681e1b370c2178444b25f0d3b9e1420c16a5ebd6fa662ca901dc59f72f54e758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qktb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9zrff\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.521911 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35cd2b31-35a0-490b-b378-78f439340260\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34df3d95c2c772edfa8a59ca382e5193c0a2b1637bea426f8c28e4dbf900486a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b541db7f3ace711b169cb9430a7940e00d140602716a6235c807cb8b8c58db2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.538584 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c32da15-9b98-45c1-be42-d7d0e89428c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:57Z\\\",\\\"message\\\":\\\":[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0311 18:51:57.790801 7483 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nF0311 18:51:57.790803 
7483 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:57Z is after 2025-08-24T17:21:41Z]\\\\nI0311 18:51:57.790812 7483 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0311 18:51:57.790823 7483 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0311 18:51:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a759b0317cb83de85
7557aefec08da46907a0cafa319dc65e3ef171c3019bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-685hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xsn92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.551261 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c99a09c4-b942-4f40-abe9-16b91e662d2b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a34b2301a7609a9c163bd9b74145dcda81ce73154dac5e811f1265146797327f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0353c523aa5c11d787b5e51e04a8c48cc877c
57c73279ec8965f874b9735012f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bn4kg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5dxdm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.565326 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c66c2480c2a5f1722d835065c392607b6978320acd85a4bae19cd2207a8774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.578776 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21686ceb-e0dd-49aa-9397-dea4bac2e26c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T18:50:14Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 18:50:14.566810 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 18:50:14.566922 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 18:50:14.567560 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-774827561/tls.crt::/tmp/serving-cert-774827561/tls.key\\\\\\\"\\\\nI0311 18:50:14.798605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 18:50:14.802877 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 18:50:14.802911 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 18:50:14.802949 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 18:50:14.802968 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 18:50:14.815780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 18:50:14.815850 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815860 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 18:50:14.815869 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 18:50:14.815882 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nI0311 18:50:14.815968 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0311 18:50:14.816062 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 18:50:14.816073 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0311 18:50:14.817102 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:50:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.590128 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.600606 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T18:50:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b0fb4e5e1393300a75effe83db1b2e2ea2b94b06f2979910ff1267e623e814d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c08d43db2b31f45a0530c71509ab3ff5ce7a2e3c824879f512cc94ee9e9862\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:50:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.610839 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2hhn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3827ef7b-1abd-4dea-acf3-474eed7b3860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T18:51:48Z\\\",\\\"message\\\":\\\"2026-03-11T18:51:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90\\\\n2026-03-11T18:51:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4435ab0e-1d7d-41db-b34d-fe43d5eaea90 to /host/opt/cni/bin/\\\\n2026-03-11T18:51:03Z [verbose] multus-daemon started\\\\n2026-03-11T18:51:03Z [verbose] 
Readiness Indicator file check\\\\n2026-03-11T18:51:48Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z597f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2hhn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.619450 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12f22b8b-b227-48b3-b1f1-322dfe40e383\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5fb096063d6872691a0c7c40b4f980d9b222068895fc66905490e14a4720cf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d
21c0f320572c21f1735736cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:51:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7ln64\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-csjgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.629698 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7a00900-ec76-49e4-9485-131830a0611e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x246p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8vd7m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc 
kubenswrapper[4842]: I0311 18:51:59.648169 4842 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d884e7eb-c1bc-4eed-ae79-9356331544eb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T18:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab1358e10b91746ef305239ff4af122c96ce5e9dc50e03882ea85c8e01e2b128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://60b59ffc9efea32ae4bedf125b887899abda045fe8ab71f38e0cc779e7b273d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0520560bb5b98f8d315928431ac998928bdd5b1cfb329f8449977675e93a5fd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d169e266395df452789b3fb788e40140588f59f1dfdd6c5b7093dc3c40fb11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb071db4c99253d2f85038b71a74ab031ca6994e094d850e44551e50e78b1483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T18:49:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81d377f8651c9f6afbed89e8d44fa6e3161de4c17013d724a2739d0684370740\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b9d0d23d6c0d8a7733c0fc968d0f10e2e9239cc633acde1844387e8d6a4d13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://989cd526481af3c009fa1328fbc0abe4e1504b7631b203c480db8084da5aa326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T18:49:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T18:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T18:49:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T18:51:59Z is after 2025-08-24T17:21:41Z" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.962128 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.962176 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:51:59 crc kubenswrapper[4842]: E0311 18:51:59.962257 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:51:59 crc kubenswrapper[4842]: I0311 18:51:59.962130 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:51:59 crc kubenswrapper[4842]: E0311 18:51:59.962441 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:51:59 crc kubenswrapper[4842]: E0311 18:51:59.962602 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:00 crc kubenswrapper[4842]: E0311 18:52:00.066596 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:00 crc kubenswrapper[4842]: I0311 18:52:00.962435 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:00 crc kubenswrapper[4842]: E0311 18:52:00.962639 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:01 crc kubenswrapper[4842]: I0311 18:52:01.961580 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:01 crc kubenswrapper[4842]: E0311 18:52:01.961707 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:01 crc kubenswrapper[4842]: I0311 18:52:01.961751 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:01 crc kubenswrapper[4842]: E0311 18:52:01.961881 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:01 crc kubenswrapper[4842]: I0311 18:52:01.962187 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:01 crc kubenswrapper[4842]: E0311 18:52:01.962525 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:02 crc kubenswrapper[4842]: I0311 18:52:02.961436 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:02 crc kubenswrapper[4842]: E0311 18:52:02.961604 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:03 crc kubenswrapper[4842]: I0311 18:52:03.961590 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:03 crc kubenswrapper[4842]: I0311 18:52:03.961637 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:03 crc kubenswrapper[4842]: I0311 18:52:03.961609 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:03 crc kubenswrapper[4842]: E0311 18:52:03.961748 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:03 crc kubenswrapper[4842]: E0311 18:52:03.961879 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:03 crc kubenswrapper[4842]: E0311 18:52:03.962184 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.445043 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.445106 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.445142 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.445174 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.445196 4842 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T18:52:04Z","lastTransitionTime":"2026-03-11T18:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.511986 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9"] Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.512647 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.516486 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.516514 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.516750 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.516848 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.553778 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.553581593 podStartE2EDuration="1m16.553581593s" podCreationTimestamp="2026-03-11 18:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.553247953 +0000 UTC 
m=+170.200944313" watchObservedRunningTime="2026-03-11 18:52:04.553581593 +0000 UTC m=+170.201277913" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.585980 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797e9574-1482-453c-979f-915f86b51c2b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.586287 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/797e9574-1482-453c-979f-915f86b51c2b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.586403 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/797e9574-1482-453c-979f-915f86b51c2b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.586513 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/797e9574-1482-453c-979f-915f86b51c2b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.586629 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/797e9574-1482-453c-979f-915f86b51c2b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.600167 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.600152259 podStartE2EDuration="1m12.600152259s" podCreationTimestamp="2026-03-11 18:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.578637427 +0000 UTC m=+170.226333717" watchObservedRunningTime="2026-03-11 18:52:04.600152259 +0000 UTC m=+170.247848539" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.641336 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2hhn6" podStartSLOduration=106.641315038 podStartE2EDuration="1m46.641315038s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.629064755 +0000 UTC m=+170.276761035" watchObservedRunningTime="2026-03-11 18:52:04.641315038 +0000 UTC m=+170.289011318" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.641479 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podStartSLOduration=106.641473153 podStartE2EDuration="1m46.641473153s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.641094632 +0000 UTC 
m=+170.288790922" watchObservedRunningTime="2026-03-11 18:52:04.641473153 +0000 UTC m=+170.289169443" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.668149 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=12.668127793 podStartE2EDuration="12.668127793s" podCreationTimestamp="2026-03-11 18:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.667821165 +0000 UTC m=+170.315517445" watchObservedRunningTime="2026-03-11 18:52:04.668127793 +0000 UTC m=+170.315824073" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.687765 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/797e9574-1482-453c-979f-915f86b51c2b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.687875 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797e9574-1482-453c-979f-915f86b51c2b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.687896 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/797e9574-1482-453c-979f-915f86b51c2b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc 
kubenswrapper[4842]: I0311 18:52:04.687920 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/797e9574-1482-453c-979f-915f86b51c2b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.687939 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/797e9574-1482-453c-979f-915f86b51c2b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.688010 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/797e9574-1482-453c-979f-915f86b51c2b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.688056 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/797e9574-1482-453c-979f-915f86b51c2b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.689044 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/797e9574-1482-453c-979f-915f86b51c2b-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.696193 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/797e9574-1482-453c-979f-915f86b51c2b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.701463 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.701439786 podStartE2EDuration="39.701439786s" podCreationTimestamp="2026-03-11 18:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.682578041 +0000 UTC m=+170.330274331" watchObservedRunningTime="2026-03-11 18:52:04.701439786 +0000 UTC m=+170.349136066" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.707359 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/797e9574-1482-453c-979f-915f86b51c2b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-blwk9\" (UID: \"797e9574-1482-453c-979f-915f86b51c2b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.728916 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nrlmz" podStartSLOduration=106.728889559 podStartE2EDuration="1m46.728889559s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-11 18:52:04.728380665 +0000 UTC m=+170.376076945" watchObservedRunningTime="2026-03-11 18:52:04.728889559 +0000 UTC m=+170.376585839" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.759529 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=76.759501834 podStartE2EDuration="1m16.759501834s" podCreationTimestamp="2026-03-11 18:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.753328496 +0000 UTC m=+170.401024776" watchObservedRunningTime="2026-03-11 18:52:04.759501834 +0000 UTC m=+170.407198114" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.778030 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8lrw5" podStartSLOduration=106.777980498 podStartE2EDuration="1m46.777980498s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.777777682 +0000 UTC m=+170.425473962" watchObservedRunningTime="2026-03-11 18:52:04.777980498 +0000 UTC m=+170.425676778" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.798910 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9zrff" podStartSLOduration=106.798888842 podStartE2EDuration="1m46.798888842s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.798065449 +0000 UTC m=+170.445761749" watchObservedRunningTime="2026-03-11 18:52:04.798888842 +0000 UTC m=+170.446585122" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.834038 4842 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" Mar 11 18:52:04 crc kubenswrapper[4842]: I0311 18:52:04.961845 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:04 crc kubenswrapper[4842]: E0311 18:52:04.962622 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:05 crc kubenswrapper[4842]: E0311 18:52:05.067110 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.202740 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.211669 4842 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.439220 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" event={"ID":"797e9574-1482-453c-979f-915f86b51c2b","Type":"ContainerStarted","Data":"99f46639f90e775cac99e6856dee53fd17e30704b9ff80d00d40b9c1be83f191"} Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.439355 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" event={"ID":"797e9574-1482-453c-979f-915f86b51c2b","Type":"ContainerStarted","Data":"798f5089f3589afb662fa8b9934ef53c8729bb83ca8b0d69501934a1ecdeb45a"} Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.463448 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5dxdm" podStartSLOduration=106.463428417 podStartE2EDuration="1m46.463428417s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:04.856675262 +0000 UTC m=+170.504371542" watchObservedRunningTime="2026-03-11 18:52:05.463428417 +0000 UTC m=+171.111124707" Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.463566 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-blwk9" podStartSLOduration=107.46356177 podStartE2EDuration="1m47.46356177s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:05.462908392 +0000 UTC m=+171.110604692" watchObservedRunningTime="2026-03-11 18:52:05.46356177 +0000 UTC m=+171.111258070" Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.962306 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.962344 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:05 crc kubenswrapper[4842]: E0311 18:52:05.962603 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:05 crc kubenswrapper[4842]: I0311 18:52:05.962986 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:05 crc kubenswrapper[4842]: E0311 18:52:05.963126 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:05 crc kubenswrapper[4842]: E0311 18:52:05.963432 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:06 crc kubenswrapper[4842]: I0311 18:52:06.961961 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:06 crc kubenswrapper[4842]: E0311 18:52:06.962717 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:07 crc kubenswrapper[4842]: I0311 18:52:07.961706 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:07 crc kubenswrapper[4842]: I0311 18:52:07.961764 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:07 crc kubenswrapper[4842]: I0311 18:52:07.961806 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:07 crc kubenswrapper[4842]: E0311 18:52:07.962345 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:07 crc kubenswrapper[4842]: E0311 18:52:07.962551 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:07 crc kubenswrapper[4842]: E0311 18:52:07.962792 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:08 crc kubenswrapper[4842]: I0311 18:52:08.961917 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:08 crc kubenswrapper[4842]: E0311 18:52:08.962069 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:09 crc kubenswrapper[4842]: I0311 18:52:09.961701 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:09 crc kubenswrapper[4842]: I0311 18:52:09.961762 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:09 crc kubenswrapper[4842]: I0311 18:52:09.961701 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:09 crc kubenswrapper[4842]: E0311 18:52:09.961843 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:09 crc kubenswrapper[4842]: E0311 18:52:09.962063 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:09 crc kubenswrapper[4842]: E0311 18:52:09.962193 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:10 crc kubenswrapper[4842]: E0311 18:52:10.068960 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:10 crc kubenswrapper[4842]: I0311 18:52:10.962163 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:10 crc kubenswrapper[4842]: E0311 18:52:10.962498 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:11 crc kubenswrapper[4842]: I0311 18:52:11.961507 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:11 crc kubenswrapper[4842]: I0311 18:52:11.961660 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:11 crc kubenswrapper[4842]: E0311 18:52:11.961735 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:11 crc kubenswrapper[4842]: I0311 18:52:11.961660 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:11 crc kubenswrapper[4842]: E0311 18:52:11.961922 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:11 crc kubenswrapper[4842]: E0311 18:52:11.961989 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:12 crc kubenswrapper[4842]: I0311 18:52:12.961598 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:12 crc kubenswrapper[4842]: E0311 18:52:12.961726 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:13 crc kubenswrapper[4842]: I0311 18:52:13.961133 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:13 crc kubenswrapper[4842]: I0311 18:52:13.961221 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:13 crc kubenswrapper[4842]: E0311 18:52:13.961315 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:13 crc kubenswrapper[4842]: I0311 18:52:13.961133 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:13 crc kubenswrapper[4842]: E0311 18:52:13.961432 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:13 crc kubenswrapper[4842]: E0311 18:52:13.961695 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:13 crc kubenswrapper[4842]: I0311 18:52:13.963511 4842 scope.go:117] "RemoveContainer" containerID="2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1" Mar 11 18:52:13 crc kubenswrapper[4842]: E0311 18:52:13.963774 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:52:14 crc kubenswrapper[4842]: I0311 18:52:14.961541 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:14 crc kubenswrapper[4842]: E0311 18:52:14.963951 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:15 crc kubenswrapper[4842]: E0311 18:52:15.069664 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:15 crc kubenswrapper[4842]: I0311 18:52:15.961962 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:15 crc kubenswrapper[4842]: I0311 18:52:15.962005 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:15 crc kubenswrapper[4842]: I0311 18:52:15.962000 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:15 crc kubenswrapper[4842]: E0311 18:52:15.962196 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:15 crc kubenswrapper[4842]: E0311 18:52:15.962398 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:15 crc kubenswrapper[4842]: E0311 18:52:15.962552 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:16 crc kubenswrapper[4842]: I0311 18:52:16.962248 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:16 crc kubenswrapper[4842]: E0311 18:52:16.963045 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:17 crc kubenswrapper[4842]: I0311 18:52:17.961995 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:17 crc kubenswrapper[4842]: I0311 18:52:17.962135 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:17 crc kubenswrapper[4842]: E0311 18:52:17.962153 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:17 crc kubenswrapper[4842]: I0311 18:52:17.962343 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:17 crc kubenswrapper[4842]: E0311 18:52:17.962524 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:17 crc kubenswrapper[4842]: E0311 18:52:17.963669 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:18 crc kubenswrapper[4842]: I0311 18:52:18.132492 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:18 crc kubenswrapper[4842]: E0311 18:52:18.132646 4842 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:52:18 crc kubenswrapper[4842]: E0311 18:52:18.132691 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs podName:a7a00900-ec76-49e4-9485-131830a0611e nodeName:}" failed. No retries permitted until 2026-03-11 18:53:22.132675919 +0000 UTC m=+247.780372199 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs") pod "network-metrics-daemon-8vd7m" (UID: "a7a00900-ec76-49e4-9485-131830a0611e") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 18:52:18 crc kubenswrapper[4842]: I0311 18:52:18.961801 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:18 crc kubenswrapper[4842]: E0311 18:52:18.961991 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:19 crc kubenswrapper[4842]: I0311 18:52:19.961492 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:19 crc kubenswrapper[4842]: I0311 18:52:19.961561 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:19 crc kubenswrapper[4842]: E0311 18:52:19.961633 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:19 crc kubenswrapper[4842]: E0311 18:52:19.961799 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:19 crc kubenswrapper[4842]: I0311 18:52:19.961823 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:19 crc kubenswrapper[4842]: E0311 18:52:19.961876 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:20 crc kubenswrapper[4842]: E0311 18:52:20.071622 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:20 crc kubenswrapper[4842]: I0311 18:52:20.961817 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:20 crc kubenswrapper[4842]: E0311 18:52:20.962031 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:21 crc kubenswrapper[4842]: I0311 18:52:21.961940 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:21 crc kubenswrapper[4842]: I0311 18:52:21.962053 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:21 crc kubenswrapper[4842]: E0311 18:52:21.962121 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:21 crc kubenswrapper[4842]: E0311 18:52:21.962230 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:21 crc kubenswrapper[4842]: I0311 18:52:21.962583 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:21 crc kubenswrapper[4842]: E0311 18:52:21.962800 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:22 crc kubenswrapper[4842]: I0311 18:52:22.962463 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:22 crc kubenswrapper[4842]: E0311 18:52:22.967657 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:23 crc kubenswrapper[4842]: I0311 18:52:23.961421 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:23 crc kubenswrapper[4842]: E0311 18:52:23.961816 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:23 crc kubenswrapper[4842]: I0311 18:52:23.961561 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:23 crc kubenswrapper[4842]: I0311 18:52:23.961518 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:23 crc kubenswrapper[4842]: E0311 18:52:23.961894 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:23 crc kubenswrapper[4842]: E0311 18:52:23.962067 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:24 crc kubenswrapper[4842]: I0311 18:52:24.961523 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:24 crc kubenswrapper[4842]: E0311 18:52:24.964746 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:25 crc kubenswrapper[4842]: E0311 18:52:25.072181 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:25 crc kubenswrapper[4842]: I0311 18:52:25.961700 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:25 crc kubenswrapper[4842]: I0311 18:52:25.961756 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:25 crc kubenswrapper[4842]: I0311 18:52:25.961790 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:25 crc kubenswrapper[4842]: E0311 18:52:25.961825 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:25 crc kubenswrapper[4842]: E0311 18:52:25.961891 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:25 crc kubenswrapper[4842]: E0311 18:52:25.961980 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:26 crc kubenswrapper[4842]: I0311 18:52:26.961639 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:26 crc kubenswrapper[4842]: E0311 18:52:26.962109 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:27 crc kubenswrapper[4842]: I0311 18:52:27.961453 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:27 crc kubenswrapper[4842]: I0311 18:52:27.961580 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:27 crc kubenswrapper[4842]: E0311 18:52:27.961656 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:27 crc kubenswrapper[4842]: I0311 18:52:27.961728 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:27 crc kubenswrapper[4842]: E0311 18:52:27.961842 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:27 crc kubenswrapper[4842]: E0311 18:52:27.961904 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:27 crc kubenswrapper[4842]: I0311 18:52:27.963348 4842 scope.go:117] "RemoveContainer" containerID="2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1" Mar 11 18:52:27 crc kubenswrapper[4842]: E0311 18:52:27.963673 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xsn92_openshift-ovn-kubernetes(5c32da15-9b98-45c1-be42-d7d0e89428c5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" Mar 11 18:52:28 crc kubenswrapper[4842]: I0311 18:52:28.961705 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:28 crc kubenswrapper[4842]: E0311 18:52:28.961888 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:29 crc kubenswrapper[4842]: I0311 18:52:29.961475 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:29 crc kubenswrapper[4842]: I0311 18:52:29.961518 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:29 crc kubenswrapper[4842]: E0311 18:52:29.961608 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:29 crc kubenswrapper[4842]: E0311 18:52:29.961828 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:29 crc kubenswrapper[4842]: I0311 18:52:29.962398 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:29 crc kubenswrapper[4842]: E0311 18:52:29.962641 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:30 crc kubenswrapper[4842]: E0311 18:52:30.072921 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:30 crc kubenswrapper[4842]: I0311 18:52:30.961887 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:30 crc kubenswrapper[4842]: E0311 18:52:30.962168 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:31 crc kubenswrapper[4842]: I0311 18:52:31.961725 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:31 crc kubenswrapper[4842]: E0311 18:52:31.961880 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:31 crc kubenswrapper[4842]: I0311 18:52:31.961743 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:31 crc kubenswrapper[4842]: I0311 18:52:31.961724 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:31 crc kubenswrapper[4842]: E0311 18:52:31.961962 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:31 crc kubenswrapper[4842]: E0311 18:52:31.962120 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:32 crc kubenswrapper[4842]: I0311 18:52:32.961212 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:32 crc kubenswrapper[4842]: E0311 18:52:32.961364 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:33 crc kubenswrapper[4842]: I0311 18:52:33.961951 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:33 crc kubenswrapper[4842]: I0311 18:52:33.962024 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:33 crc kubenswrapper[4842]: E0311 18:52:33.962106 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:33 crc kubenswrapper[4842]: I0311 18:52:33.962313 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:33 crc kubenswrapper[4842]: E0311 18:52:33.962394 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:33 crc kubenswrapper[4842]: E0311 18:52:33.962522 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:34 crc kubenswrapper[4842]: I0311 18:52:34.544578 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/1.log" Mar 11 18:52:34 crc kubenswrapper[4842]: I0311 18:52:34.545088 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/0.log" Mar 11 18:52:34 crc kubenswrapper[4842]: I0311 18:52:34.545123 4842 generic.go:334] "Generic (PLEG): container finished" podID="3827ef7b-1abd-4dea-acf3-474eed7b3860" containerID="42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a" exitCode=1 Mar 11 18:52:34 crc kubenswrapper[4842]: I0311 18:52:34.545146 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerDied","Data":"42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a"} Mar 11 18:52:34 crc kubenswrapper[4842]: I0311 18:52:34.545173 4842 scope.go:117] "RemoveContainer" containerID="caf1bf69c626045a7da81018e9b08663199478c12de198ebb1a9aa95dfe45bbf" Mar 11 18:52:34 crc kubenswrapper[4842]: I0311 18:52:34.545710 4842 scope.go:117] "RemoveContainer" containerID="42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a" Mar 11 18:52:34 crc kubenswrapper[4842]: E0311 18:52:34.545897 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-2hhn6_openshift-multus(3827ef7b-1abd-4dea-acf3-474eed7b3860)\"" pod="openshift-multus/multus-2hhn6" podUID="3827ef7b-1abd-4dea-acf3-474eed7b3860" Mar 11 18:52:34 crc kubenswrapper[4842]: I0311 18:52:34.961255 4842 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:34 crc kubenswrapper[4842]: E0311 18:52:34.963509 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:35 crc kubenswrapper[4842]: E0311 18:52:35.086939 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:35 crc kubenswrapper[4842]: I0311 18:52:35.551869 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/1.log" Mar 11 18:52:35 crc kubenswrapper[4842]: I0311 18:52:35.961077 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:35 crc kubenswrapper[4842]: I0311 18:52:35.961115 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:35 crc kubenswrapper[4842]: E0311 18:52:35.961204 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:35 crc kubenswrapper[4842]: E0311 18:52:35.961317 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:35 crc kubenswrapper[4842]: I0311 18:52:35.961493 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:35 crc kubenswrapper[4842]: E0311 18:52:35.961640 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:36 crc kubenswrapper[4842]: I0311 18:52:36.961902 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:36 crc kubenswrapper[4842]: E0311 18:52:36.962027 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:37 crc kubenswrapper[4842]: I0311 18:52:37.961835 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:37 crc kubenswrapper[4842]: I0311 18:52:37.961891 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:37 crc kubenswrapper[4842]: I0311 18:52:37.961895 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:37 crc kubenswrapper[4842]: E0311 18:52:37.962086 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:37 crc kubenswrapper[4842]: E0311 18:52:37.962157 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:37 crc kubenswrapper[4842]: E0311 18:52:37.962224 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:38 crc kubenswrapper[4842]: I0311 18:52:38.961745 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:38 crc kubenswrapper[4842]: E0311 18:52:38.961896 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:39 crc kubenswrapper[4842]: I0311 18:52:39.961132 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:39 crc kubenswrapper[4842]: I0311 18:52:39.961221 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:39 crc kubenswrapper[4842]: E0311 18:52:39.961336 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:39 crc kubenswrapper[4842]: I0311 18:52:39.961210 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:39 crc kubenswrapper[4842]: E0311 18:52:39.961543 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:39 crc kubenswrapper[4842]: E0311 18:52:39.961811 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:40 crc kubenswrapper[4842]: E0311 18:52:40.088669 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:40 crc kubenswrapper[4842]: I0311 18:52:40.961447 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:40 crc kubenswrapper[4842]: E0311 18:52:40.961642 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:40 crc kubenswrapper[4842]: I0311 18:52:40.962747 4842 scope.go:117] "RemoveContainer" containerID="2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1" Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.576618 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/3.log" Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.579632 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerStarted","Data":"915fc3a7862038ff15ddd6b82f3fcb6a5af04baf8d5a5f62544049b2607b9f18"} Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.580247 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.617552 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podStartSLOduration=143.617534389 podStartE2EDuration="2m23.617534389s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:41.615853691 +0000 UTC m=+207.263550001" watchObservedRunningTime="2026-03-11 18:52:41.617534389 +0000 UTC m=+207.265230669" Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.872385 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8vd7m"] Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.872471 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:41 crc kubenswrapper[4842]: E0311 18:52:41.872552 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.961992 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:41 crc kubenswrapper[4842]: I0311 18:52:41.962048 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:41 crc kubenswrapper[4842]: E0311 18:52:41.962139 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:41 crc kubenswrapper[4842]: E0311 18:52:41.962186 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:42 crc kubenswrapper[4842]: I0311 18:52:42.962026 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:42 crc kubenswrapper[4842]: E0311 18:52:42.962204 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:43 crc kubenswrapper[4842]: I0311 18:52:43.961595 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:43 crc kubenswrapper[4842]: E0311 18:52:43.961728 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:43 crc kubenswrapper[4842]: I0311 18:52:43.961923 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:43 crc kubenswrapper[4842]: E0311 18:52:43.961986 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:43 crc kubenswrapper[4842]: I0311 18:52:43.962099 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:43 crc kubenswrapper[4842]: E0311 18:52:43.962147 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:44 crc kubenswrapper[4842]: I0311 18:52:44.961392 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:44 crc kubenswrapper[4842]: E0311 18:52:44.962699 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:45 crc kubenswrapper[4842]: E0311 18:52:45.089470 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 18:52:45 crc kubenswrapper[4842]: I0311 18:52:45.962105 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:45 crc kubenswrapper[4842]: I0311 18:52:45.962210 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:45 crc kubenswrapper[4842]: I0311 18:52:45.962105 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:45 crc kubenswrapper[4842]: E0311 18:52:45.962232 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:45 crc kubenswrapper[4842]: E0311 18:52:45.962389 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:45 crc kubenswrapper[4842]: E0311 18:52:45.962445 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:46 crc kubenswrapper[4842]: I0311 18:52:46.961671 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:46 crc kubenswrapper[4842]: I0311 18:52:46.962184 4842 scope.go:117] "RemoveContainer" containerID="42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a" Mar 11 18:52:46 crc kubenswrapper[4842]: E0311 18:52:46.962205 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:47 crc kubenswrapper[4842]: I0311 18:52:47.600725 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/1.log" Mar 11 18:52:47 crc kubenswrapper[4842]: I0311 18:52:47.600790 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerStarted","Data":"8495aa5c205b0ecc00ef526be4fec937aacfe0fca4b2ed45604286f61edd1612"} Mar 11 18:52:47 crc kubenswrapper[4842]: I0311 18:52:47.961518 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:47 crc kubenswrapper[4842]: I0311 18:52:47.961552 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:47 crc kubenswrapper[4842]: I0311 18:52:47.962010 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:47 crc kubenswrapper[4842]: E0311 18:52:47.962254 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:47 crc kubenswrapper[4842]: E0311 18:52:47.962911 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:47 crc kubenswrapper[4842]: E0311 18:52:47.963112 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:48 crc kubenswrapper[4842]: I0311 18:52:48.962053 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:48 crc kubenswrapper[4842]: E0311 18:52:48.962241 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 18:52:49 crc kubenswrapper[4842]: I0311 18:52:49.961233 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:49 crc kubenswrapper[4842]: E0311 18:52:49.961732 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:52:49 crc kubenswrapper[4842]: I0311 18:52:49.961257 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:49 crc kubenswrapper[4842]: E0311 18:52:49.961799 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 18:52:49 crc kubenswrapper[4842]: I0311 18:52:49.961257 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:49 crc kubenswrapper[4842]: E0311 18:52:49.961853 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8vd7m" podUID="a7a00900-ec76-49e4-9485-131830a0611e" Mar 11 18:52:50 crc kubenswrapper[4842]: I0311 18:52:50.908526 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:52:50 crc kubenswrapper[4842]: E0311 18:52:50.908691 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:54:52.908670527 +0000 UTC m=+338.556366817 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:52:50 crc kubenswrapper[4842]: I0311 18:52:50.962475 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:50 crc kubenswrapper[4842]: I0311 18:52:50.965919 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 18:52:50 crc kubenswrapper[4842]: I0311 18:52:50.966416 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.010033 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.010313 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.010375 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.010447 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.010540 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.010582 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.010604 4842 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.010684 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 18:54:53.01066109 +0000 UTC m=+338.658357410 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.011351 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.011501 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.011611 4842 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:52:51 crc kubenswrapper[4842]: E0311 18:52:51.011833 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 18:54:53.011806013 +0000 UTC m=+338.659502393 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.011945 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.018540 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.278697 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.613444 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"24217c9696ea7f61efc1df8eb6997e9d5fa49ea23306ef3ad5f8ee5775339b0d"} Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.613985 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9e1bfce241a29f8fbe5cab8484ee70efb3c44907913a679a5aa49a61a0ff84c7"} Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.961664 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.961733 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.961733 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.964169 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.964584 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.965412 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 18:52:51 crc kubenswrapper[4842]: I0311 18:52:51.966690 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.654185 4842 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.721190 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h2kpt"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.722263 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.722812 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bz2cr"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.723457 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.723752 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mhz6l"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.724339 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mhz6l" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.730508 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.731044 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.731449 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.731977 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.734161 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.735151 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.738869 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.739827 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.739985 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.740213 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.741014 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.742211 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.742816 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.742952 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.746340 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.747630 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751375 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751470 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751557 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751604 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751385 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751489 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752599 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751607 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751629 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751704 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751718 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751740 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751755 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751805 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751828 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751885 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751891 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751904 4842 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751920 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751927 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751952 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751991 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.751978 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752028 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752044 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752082 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754093 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752091 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 11 18:52:55 
crc kubenswrapper[4842]: I0311 18:52:55.752086 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752119 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752130 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752140 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752160 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752173 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.752516 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754485 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754640 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754704 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kgd5\" (UniqueName: \"kubernetes.io/projected/ca012f19-1dcd-41c8-8e17-bb98db200573-kube-api-access-4kgd5\") pod \"downloads-7954f5f757-mhz6l\" (UID: \"ca012f19-1dcd-41c8-8e17-bb98db200573\") " pod="openshift-console/downloads-7954f5f757-mhz6l" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754727 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754777 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhkd\" (UniqueName: \"kubernetes.io/projected/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-kube-api-access-nxhkd\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754838 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754847 4842 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754871 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjr4l\" (UniqueName: \"kubernetes.io/projected/5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0-kube-api-access-pjr4l\") pod \"cluster-samples-operator-665b6dd947-dh5kz\" (UID: \"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754902 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-image-import-ca\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754916 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f9m6d"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754943 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dh5kz\" (UID: \"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754967 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhrh\" (UniqueName: \"kubernetes.io/projected/34df260e-28ff-4766-a6ea-5e8df0d34060-kube-api-access-qvhrh\") pod 
\"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.754990 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqg2r\" (UniqueName: \"kubernetes.io/projected/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-kube-api-access-dqg2r\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755016 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-etcd-client\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755039 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-encryption-config\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755066 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-etcd-client\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755089 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755112 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/855d410e-6475-4d9c-b523-7fa091254a84-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755139 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-config\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755163 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-audit\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755193 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spjjc\" (UniqueName: \"kubernetes.io/projected/062df717-6373-434b-a199-26182e191332-kube-api-access-spjjc\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755222 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-audit-policies\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755249 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-serving-cert\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755300 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/855d410e-6475-4d9c-b523-7fa091254a84-serving-cert\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755334 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755356 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-encryption-config\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755380 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5fa4203-40f4-4fb7-a79a-b983415cd996-audit-dir\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755457 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755495 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-serving-cert\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755529 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34df260e-28ff-4766-a6ea-5e8df0d34060-serving-cert\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:55 crc 
kubenswrapper[4842]: I0311 18:52:55.755654 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-etcd-serving-ca\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755706 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-config\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755733 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/062df717-6373-434b-a199-26182e191332-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755772 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-node-pullsecrets\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755842 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755883 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/062df717-6373-434b-a199-26182e191332-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755908 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/062df717-6373-434b-a199-26182e191332-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755934 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85v4x\" (UniqueName: \"kubernetes.io/projected/d5fa4203-40f4-4fb7-a79a-b983415cd996-kube-api-access-85v4x\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.755957 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-client-ca\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 
18:52:55.755980 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lppqq\" (UniqueName: \"kubernetes.io/projected/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-kube-api-access-lppqq\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.756005 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.756052 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbjhl\" (UniqueName: \"kubernetes.io/projected/855d410e-6475-4d9c-b523-7fa091254a84-kube-api-access-vbjhl\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.756079 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-audit-dir\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.756122 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.763864 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.771883 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.772649 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.772846 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.773723 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.773990 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.774207 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.775173 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.776458 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.776720 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.776950 4842 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tl2vq"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.777045 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.777405 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.777432 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.777519 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6bvx"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.787752 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.789315 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.793891 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.797026 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.797576 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.777584 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 18:52:55 crc 
kubenswrapper[4842]: I0311 18:52:55.810454 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.814296 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.815376 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.816014 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.816301 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.817085 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.818628 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2flrr"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.819452 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.820200 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821069 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821077 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821213 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821410 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821612 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821619 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821715 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821874 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821974 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.821986 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.827749 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.828508 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-v5z72"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.828792 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xgzxx"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.829224 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xgzxx"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.835900 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.835953 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836198 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836211 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.838999 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v5z72"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.841936 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836249 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836351 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836394 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836433 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836468 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836524 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836566 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836604 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836642 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836678 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836722 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836799 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836858 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.836891 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.842989 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.844173 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.844781 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.847146 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.847737 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.848165 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.852844 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.853110 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.854934 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.855573 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.855821 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.855942 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.857681 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.857891 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858315 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858400 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-image-import-ca\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858482 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjr4l\" (UniqueName: \"kubernetes.io/projected/5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0-kube-api-access-pjr4l\") pod \"cluster-samples-operator-665b6dd947-dh5kz\" (UID: \"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858550 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dh5kz\" (UID: \"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858623 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhrh\" (UniqueName: \"kubernetes.io/projected/34df260e-28ff-4766-a6ea-5e8df0d34060-kube-api-access-qvhrh\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858696 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-etcd-client\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858769 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqg2r\" (UniqueName: \"kubernetes.io/projected/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-kube-api-access-dqg2r\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858976 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-encryption-config\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.859096 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-etcd-client\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.859214 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.859375 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-config\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.859516 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-audit\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.859615 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-image-import-ca\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.859629 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/855d410e-6475-4d9c-b523-7fa091254a84-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.858554 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.861460 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.862041 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4glcd"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.862113 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spjjc\" (UniqueName: \"kubernetes.io/projected/062df717-6373-434b-a199-26182e191332-kube-api-access-spjjc\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.862147 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-audit-policies\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.862689 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fmkft"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.863100 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-twwzj"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866172 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-audit\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866235 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866333 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-serving-cert\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866467 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866562 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/855d410e-6475-4d9c-b523-7fa091254a84-serving-cert\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866592 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866614 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-encryption-config\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.866640 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5fa4203-40f4-4fb7-a79a-b983415cd996-audit-dir\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.867471 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/855d410e-6475-4d9c-b523-7fa091254a84-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.868206 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.868348 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-config\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.868799 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-audit-policies\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.868867 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5fa4203-40f4-4fb7-a79a-b983415cd996-audit-dir\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.869041 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.869068 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-serving-cert\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.869127 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34df260e-28ff-4766-a6ea-5e8df0d34060-serving-cert\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.869147 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-etcd-serving-ca\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.873408 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-etcd-serving-ca\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.874051 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-config\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.874174 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.874794 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/062df717-6373-434b-a199-26182e191332-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.877072 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-config\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.889091 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34df260e-28ff-4766-a6ea-5e8df0d34060-serving-cert\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.889172 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-serving-cert\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.889758 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.889769 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-config\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.890122 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.891254 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-node-pullsecrets\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.891314 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.891579 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/062df717-6373-434b-a199-26182e191332-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.891822 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/062df717-6373-434b-a199-26182e191332-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.891952 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85v4x\" (UniqueName: \"kubernetes.io/projected/d5fa4203-40f4-4fb7-a79a-b983415cd996-kube-api-access-85v4x\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.892059 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-client-ca\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.892089 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.895008 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-node-pullsecrets\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.895906 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.892202 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lppqq\" (UniqueName: \"kubernetes.io/projected/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-kube-api-access-lppqq\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.896388 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbjhl\" (UniqueName: \"kubernetes.io/projected/855d410e-6475-4d9c-b523-7fa091254a84-kube-api-access-vbjhl\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.896515 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-audit-dir\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.896650 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.896778 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kgd5\" (UniqueName: \"kubernetes.io/projected/ca012f19-1dcd-41c8-8e17-bb98db200573-kube-api-access-4kgd5\") pod \"downloads-7954f5f757-mhz6l\" (UID: \"ca012f19-1dcd-41c8-8e17-bb98db200573\") " pod="openshift-console/downloads-7954f5f757-mhz6l"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.896816 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.896932 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxhkd\" (UniqueName: \"kubernetes.io/projected/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-kube-api-access-nxhkd\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.897593 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-audit-dir\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.899635 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-etcd-client\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.902836 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d5fa4203-40f4-4fb7-a79a-b983415cd996-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.911116 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.917961 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/062df717-6373-434b-a199-26182e191332-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.923401 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/855d410e-6475-4d9c-b523-7fa091254a84-serving-cert\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.925403 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-dh5kz\" (UID: \"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.926275 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-client-ca\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.926644 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-encryption-config\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.927641 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-etcd-client\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.928240 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.928540 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.929072 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-encryption-config\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.930581 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.930857 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.932014 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5fa4203-40f4-4fb7-a79a-b983415cd996-serving-cert\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.932372 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.932848 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.933313 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.933588 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.933911 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.933940 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.934105 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.935013 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz"]
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.935926 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.935207 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x"
Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.935347 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.936932 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.937495 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.937618 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/062df717-6373-434b-a199-26182e191332-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.937853 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.938225 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.938794 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.939556 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.940306 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.940840 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.943856 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.944513 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mhz6l"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.955718 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.967329 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.968096 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.968139 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ntf4r"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.968775 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.969255 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.972612 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h2kpt"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.977016 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.977897 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.985335 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zkds6"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.986586 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.988228 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.993622 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554252-4qdmf"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.994437 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.995213 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.996654 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.996794 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.997404 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998034 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de8073d6-b58a-41f8-a20e-1de8878ee12a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998117 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb28ce8-aad8-4db8-8492-319989f0059b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h8nlz\" (UID: \"6bb28ce8-aad8-4db8-8492-319989f0059b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998185 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-czxnq"] Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998190 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-serving-cert\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998369 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998447 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-dir\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998515 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998598 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998674 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-config\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998753 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-serving-cert\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998830 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-client-ca\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998904 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-srv-cert\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.998973 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999049 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4gkk\" (UniqueName: \"kubernetes.io/projected/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-kube-api-access-h4gkk\") pod 
\"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999064 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999127 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999251 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999337 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999417 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v55jb\" (UniqueName: 
\"kubernetes.io/projected/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-kube-api-access-v55jb\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999489 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-ca\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999560 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-default-certificate\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999625 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-stats-auth\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999692 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cf78d4e-beb5-4487-b728-e34230363308-config\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999813 4842 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-service-ca\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999890 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:55 crc kubenswrapper[4842]: I0311 18:52:55.999957 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-trusted-ca-bundle\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000028 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000100 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-auth-proxy-config\") pod 
\"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000172 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g55xf\" (UniqueName: \"kubernetes.io/projected/de8073d6-b58a-41f8-a20e-1de8878ee12a-kube-api-access-g55xf\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000242 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-config\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000353 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfz8\" (UniqueName: \"kubernetes.io/projected/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-kube-api-access-mpfz8\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000438 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb77p\" (UniqueName: \"kubernetes.io/projected/28316cb3-4478-424c-bf38-43d5645ee769-kube-api-access-xb77p\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 
18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000512 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000597 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rjr\" (UniqueName: \"kubernetes.io/projected/148bd39e-58ee-4a7f-aa9c-8435ab50d862-kube-api-access-58rjr\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000666 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de8073d6-b58a-41f8-a20e-1de8878ee12a-trusted-ca\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000739 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28316cb3-4478-424c-bf38-43d5645ee769-config\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000804 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000888 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdd2\" (UniqueName: \"kubernetes.io/projected/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-kube-api-access-8tdd2\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.000959 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20246915-b29b-42bf-871e-81fcf4b2da46-config\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001029 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-config-volume\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001158 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wb8\" (UniqueName: \"kubernetes.io/projected/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-kube-api-access-g8wb8\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001230 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-profile-collector-cert\") pod \"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001335 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-config\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001404 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-service-ca-bundle\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001482 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de8073d6-b58a-41f8-a20e-1de8878ee12a-metrics-tls\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001552 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a286812e-873b-4844-a8ee-600ebdf1df1b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001634 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20246915-b29b-42bf-871e-81fcf4b2da46-serving-cert\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001718 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft44f\" (UniqueName: \"kubernetes.io/projected/6639342f-1d7c-4b9f-9836-2df2063e57b5-kube-api-access-ft44f\") pod \"dns-operator-744455d44c-c6bvx\" (UID: \"6639342f-1d7c-4b9f-9836-2df2063e57b5\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001794 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-config\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001864 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-client\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.001939 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mxgm\" (UniqueName: \"kubernetes.io/projected/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-kube-api-access-9mxgm\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002041 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-config\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002114 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4xh\" (UniqueName: \"kubernetes.io/projected/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-kube-api-access-qd4xh\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002180 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf78d4e-beb5-4487-b728-e34230363308-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002256 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-policies\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002351 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zphsc\" (UniqueName: \"kubernetes.io/projected/6bb28ce8-aad8-4db8-8492-319989f0059b-kube-api-access-zphsc\") pod \"package-server-manager-789f6589d5-h8nlz\" (UID: \"6bb28ce8-aad8-4db8-8492-319989f0059b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002451 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002537 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-oauth-serving-cert\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002605 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-oauth-config\") pod 
\"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002688 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cf78d4e-beb5-4487-b728-e34230363308-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002763 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-config\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002832 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002897 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a286812e-873b-4844-a8ee-600ebdf1df1b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.002975 
4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6639342f-1d7c-4b9f-9836-2df2063e57b5-metrics-tls\") pod \"dns-operator-744455d44c-c6bvx\" (UID: \"6639342f-1d7c-4b9f-9836-2df2063e57b5\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003054 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sdct\" (UniqueName: \"kubernetes.io/projected/20246915-b29b-42bf-871e-81fcf4b2da46-kube-api-access-9sdct\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003143 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-srv-cert\") pod \"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003214 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dl8s\" (UniqueName: \"kubernetes.io/projected/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-kube-api-access-4dl8s\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003306 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003657 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003753 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bab77e0c-a6e8-4e8b-a036-695cda94d7db-service-ca-bundle\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003831 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/28316cb3-4478-424c-bf38-43d5645ee769-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.003928 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004004 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20246915-b29b-42bf-871e-81fcf4b2da46-trusted-ca\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004182 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004311 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/28316cb3-4478-424c-bf38-43d5645ee769-images\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004355 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp6lp\" (UniqueName: \"kubernetes.io/projected/bab77e0c-a6e8-4e8b-a036-695cda94d7db-kube-api-access-hp6lp\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004377 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-machine-approver-tls\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004397 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-service-ca\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004415 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-serving-cert\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004439 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a286812e-873b-4844-a8ee-600ebdf1df1b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004469 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-metrics-certs\") pod \"router-default-5444994796-xgzxx\" (UID: 
\"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004500 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-serving-cert\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004531 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4tnz\" (UniqueName: \"kubernetes.io/projected/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-kube-api-access-b4tnz\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.004596 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-secret-volume\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.006421 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.020426 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f9m6d"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.020749 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"] Mar 11 18:52:56 
crc kubenswrapper[4842]: I0311 18:52:56.023764 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.024484 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.024558 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bz2cr"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.026615 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.044811 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.047940 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6bvx"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.048067 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.048236 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tl2vq"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.049656 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.050429 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.050634 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.051484 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4glcd"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.052417 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.053609 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.055960 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4pbjx"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.060098 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.060215 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-twwzj"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.060303 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2flrr"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.060474 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4pbjx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.061430 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.064822 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.067476 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.076563 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fmkft"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.078371 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.078948 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.079952 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vlcsj"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.081594 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.081686 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.082381 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.083880 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.083908 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.084508 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.085461 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554252-4qdmf"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.086418 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.087450 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.088467 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pbjx"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.090773 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v5z72"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.091805 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zkds6"] Mar 11 18:52:56 crc kubenswrapper[4842]: 
I0311 18:52:56.093192 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.094628 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-czxnq"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.095950 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ntf4r"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.097133 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vlcsj"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.098197 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.099198 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7x6n2"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.099770 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7x6n2" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.100460 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7x6n2"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.101674 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fz2xd"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.102527 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fz2xd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.104358 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105354 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-serving-cert\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105384 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105416 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-dir\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105435 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105455 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105471 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-config\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105489 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-serving-cert\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105507 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-client-ca\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105524 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-srv-cert\") pod 
\"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105548 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105563 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105586 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4gkk\" (UniqueName: \"kubernetes.io/projected/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-kube-api-access-h4gkk\") pod \"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105602 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v55jb\" (UniqueName: \"kubernetes.io/projected/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-kube-api-access-v55jb\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105616 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105634 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105650 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-ca\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105668 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cf78d4e-beb5-4487-b728-e34230363308-config\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105687 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-default-certificate\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" 
Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105704 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-stats-auth\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105741 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-service-ca\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105756 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105771 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-trusted-ca-bundle\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105790 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105809 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-auth-proxy-config\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105828 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g55xf\" (UniqueName: \"kubernetes.io/projected/de8073d6-b58a-41f8-a20e-1de8878ee12a-kube-api-access-g55xf\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105849 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-config\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105876 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfz8\" (UniqueName: \"kubernetes.io/projected/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-kube-api-access-mpfz8\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105897 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xb77p\" (UniqueName: \"kubernetes.io/projected/28316cb3-4478-424c-bf38-43d5645ee769-kube-api-access-xb77p\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105914 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105966 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rjr\" (UniqueName: \"kubernetes.io/projected/148bd39e-58ee-4a7f-aa9c-8435ab50d862-kube-api-access-58rjr\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105982 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de8073d6-b58a-41f8-a20e-1de8878ee12a-trusted-ca\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.105998 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdd2\" (UniqueName: \"kubernetes.io/projected/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-kube-api-access-8tdd2\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 
11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106018 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28316cb3-4478-424c-bf38-43d5645ee769-config\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106037 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106057 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20246915-b29b-42bf-871e-81fcf4b2da46-config\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106107 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-config-volume\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106131 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-service-ca-bundle\") pod \"authentication-operator-69f744f599-4glcd\" (UID: 
\"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106168 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wb8\" (UniqueName: \"kubernetes.io/projected/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-kube-api-access-g8wb8\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106188 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-profile-collector-cert\") pod \"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106208 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-config\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106225 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de8073d6-b58a-41f8-a20e-1de8878ee12a-metrics-tls\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106245 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a286812e-873b-4844-a8ee-600ebdf1df1b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106261 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20246915-b29b-42bf-871e-81fcf4b2da46-serving-cert\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106299 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft44f\" (UniqueName: \"kubernetes.io/projected/6639342f-1d7c-4b9f-9836-2df2063e57b5-kube-api-access-ft44f\") pod \"dns-operator-744455d44c-c6bvx\" (UID: \"6639342f-1d7c-4b9f-9836-2df2063e57b5\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106322 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-config\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106339 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-client\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106356 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mxgm\" (UniqueName: \"kubernetes.io/projected/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-kube-api-access-9mxgm\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106375 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-config\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106396 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4xh\" (UniqueName: \"kubernetes.io/projected/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-kube-api-access-qd4xh\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106414 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf78d4e-beb5-4487-b728-e34230363308-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106431 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-policies\") pod \"oauth-openshift-558db77b4-2flrr\" 
(UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106450 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zphsc\" (UniqueName: \"kubernetes.io/projected/6bb28ce8-aad8-4db8-8492-319989f0059b-kube-api-access-zphsc\") pod \"package-server-manager-789f6589d5-h8nlz\" (UID: \"6bb28ce8-aad8-4db8-8492-319989f0059b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106468 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106488 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-oauth-serving-cert\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106494 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-config\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106516 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106813 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-dir\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107164 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-client-ca\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107181 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.106506 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-oauth-config\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107495 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107508 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cf78d4e-beb5-4487-b728-e34230363308-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107555 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-config\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107594 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107627 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a286812e-873b-4844-a8ee-600ebdf1df1b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: 
\"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107653 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6639342f-1d7c-4b9f-9836-2df2063e57b5-metrics-tls\") pod \"dns-operator-744455d44c-c6bvx\" (UID: \"6639342f-1d7c-4b9f-9836-2df2063e57b5\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107681 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sdct\" (UniqueName: \"kubernetes.io/projected/20246915-b29b-42bf-871e-81fcf4b2da46-kube-api-access-9sdct\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107715 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-srv-cert\") pod \"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107740 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dl8s\" (UniqueName: \"kubernetes.io/projected/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-kube-api-access-4dl8s\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107768 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107795 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107824 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bab77e0c-a6e8-4e8b-a036-695cda94d7db-service-ca-bundle\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.107851 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/28316cb3-4478-424c-bf38-43d5645ee769-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.108013 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-config\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc 
kubenswrapper[4842]: I0311 18:52:56.108368 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-oauth-serving-cert\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.108679 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-config\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.108766 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.108917 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bab77e0c-a6e8-4e8b-a036-695cda94d7db-service-ca-bundle\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.108950 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.108992 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/28316cb3-4478-424c-bf38-43d5645ee769-images\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109043 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20246915-b29b-42bf-871e-81fcf4b2da46-trusted-ca\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109070 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109075 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20246915-b29b-42bf-871e-81fcf4b2da46-config\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109102 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp6lp\" (UniqueName: 
\"kubernetes.io/projected/bab77e0c-a6e8-4e8b-a036-695cda94d7db-kube-api-access-hp6lp\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109126 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-machine-approver-tls\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109150 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-serving-cert\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109172 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-service-ca\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109192 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-serving-cert\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109216 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a286812e-873b-4844-a8ee-600ebdf1df1b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109003 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-trusted-ca-bundle\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109308 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-policies\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109404 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-metrics-certs\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109433 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4tnz\" (UniqueName: \"kubernetes.io/projected/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-kube-api-access-b4tnz\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109455 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-secret-volume\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109478 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb28ce8-aad8-4db8-8492-319989f0059b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h8nlz\" (UID: \"6bb28ce8-aad8-4db8-8492-319989f0059b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109510 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de8073d6-b58a-41f8-a20e-1de8878ee12a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.109921 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-service-ca\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.111514 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.111541 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.111627 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-client\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.111898 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20246915-b29b-42bf-871e-81fcf4b2da46-trusted-ca\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.112225 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-stats-auth\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.112843 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-ca\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.113528 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-etcd-service-ca\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.113663 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-auth-proxy-config\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.113722 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-profile-collector-cert\") pod \"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.113833 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-config\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.114823 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-metrics-certs\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.115618 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-serving-cert\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.115733 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-secret-volume\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.116199 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.116441 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.116730 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.116825 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf78d4e-beb5-4487-b728-e34230363308-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.116922 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.117045 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-serving-cert\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.117259 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-oauth-config\") 
pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.117412 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6639342f-1d7c-4b9f-9836-2df2063e57b5-metrics-tls\") pod \"dns-operator-744455d44c-c6bvx\" (UID: \"6639342f-1d7c-4b9f-9836-2df2063e57b5\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.118132 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.118180 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20246915-b29b-42bf-871e-81fcf4b2da46-serving-cert\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.118449 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-serving-cert\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.119206 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-srv-cert\") pod 
\"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.119337 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bab77e0c-a6e8-4e8b-a036-695cda94d7db-default-certificate\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.119953 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-machine-approver-tls\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.119961 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.124909 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.145803 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.150265 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9cf78d4e-beb5-4487-b728-e34230363308-config\") pod \"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.185966 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjr4l\" (UniqueName: \"kubernetes.io/projected/5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0-kube-api-access-pjr4l\") pod \"cluster-samples-operator-665b6dd947-dh5kz\" (UID: \"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.200253 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhrh\" (UniqueName: \"kubernetes.io/projected/34df260e-28ff-4766-a6ea-5e8df0d34060-kube-api-access-qvhrh\") pod \"controller-manager-879f6c89f-bz2cr\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.224623 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.230621 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqg2r\" (UniqueName: \"kubernetes.io/projected/5ae0b73c-e430-44fb-81b7-9fe4284dc73e-kube-api-access-dqg2r\") pod \"apiserver-76f77b778f-h2kpt\" (UID: \"5ae0b73c-e430-44fb-81b7-9fe4284dc73e\") " pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.245839 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.266557 4842 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.276641 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.279840 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-serving-cert\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.284602 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.290754 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-config\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.304913 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.334253 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.348061 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 
18:52:56.349696 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.359423 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-service-ca-bundle\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.370243 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.385787 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spjjc\" (UniqueName: \"kubernetes.io/projected/062df717-6373-434b-a199-26182e191332-kube-api-access-spjjc\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.412772 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/062df717-6373-434b-a199-26182e191332-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-sgbvv\" (UID: \"062df717-6373-434b-a199-26182e191332\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.420760 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-85v4x\" (UniqueName: \"kubernetes.io/projected/d5fa4203-40f4-4fb7-a79a-b983415cd996-kube-api-access-85v4x\") pod \"apiserver-7bbb656c7d-wg2s2\" (UID: \"d5fa4203-40f4-4fb7-a79a-b983415cd996\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.442511 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxhkd\" (UniqueName: \"kubernetes.io/projected/e9a5a3c6-1212-40e7-be6c-c311f3fed92e-kube-api-access-nxhkd\") pod \"openshift-controller-manager-operator-756b6f6bc6-r52q4\" (UID: \"e9a5a3c6-1212-40e7-be6c-c311f3fed92e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.453401 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.464256 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lppqq\" (UniqueName: \"kubernetes.io/projected/a7403eeb-ce97-4c7c-8bab-b69bff33c2f7-kube-api-access-lppqq\") pod \"openshift-apiserver-operator-796bbdcf4f-skx79\" (UID: \"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.475831 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.476008 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.484290 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kgd5\" (UniqueName: \"kubernetes.io/projected/ca012f19-1dcd-41c8-8e17-bb98db200573-kube-api-access-4kgd5\") pod \"downloads-7954f5f757-mhz6l\" (UID: \"ca012f19-1dcd-41c8-8e17-bb98db200573\") " pod="openshift-console/downloads-7954f5f757-mhz6l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.500669 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbjhl\" (UniqueName: \"kubernetes.io/projected/855d410e-6475-4d9c-b523-7fa091254a84-kube-api-access-vbjhl\") pod \"openshift-config-operator-7777fb866f-zxzbs\" (UID: \"855d410e-6475-4d9c-b523-7fa091254a84\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.515288 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.527120 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.546030 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.567858 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.573762 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.583016 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h2kpt"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.584309 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.596759 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.604810 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.625185 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.629753 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.643569 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" event={"ID":"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0","Type":"ContainerStarted","Data":"afe568b20ac36cc7f0e36bb21075badb1496039ea14ed79e4ef2149d077a518f"} Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.645352 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.651919 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bz2cr"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.655105 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/28316cb3-4478-424c-bf38-43d5645ee769-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.659650 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" event={"ID":"5ae0b73c-e430-44fb-81b7-9fe4284dc73e","Type":"ContainerStarted","Data":"67b638a703170bea0a0f1f7c950886e1760202f7a59cd793db935ef73472e953"} Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.664864 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.670510 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28316cb3-4478-424c-bf38-43d5645ee769-config\") pod 
\"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: W0311 18:52:56.679392 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34df260e_28ff_4766_a6ea_5e8df0d34060.slice/crio-25951ced7d2a1b1ea95a8672ad460d20f7b3522209821042a14b5992280adab8 WatchSource:0}: Error finding container 25951ced7d2a1b1ea95a8672ad460d20f7b3522209821042a14b5992280adab8: Status 404 returned error can't find the container with id 25951ced7d2a1b1ea95a8672ad460d20f7b3522209821042a14b5992280adab8 Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.685256 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.692356 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/28316cb3-4478-424c-bf38-43d5645ee769-images\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.708798 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.725095 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.730435 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.732817 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/de8073d6-b58a-41f8-a20e-1de8878ee12a-metrics-tls\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.749674 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.752259 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de8073d6-b58a-41f8-a20e-1de8878ee12a-trusted-ca\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.752763 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mhz6l" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.765028 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.785813 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.789899 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.809603 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.826336 4842 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.846579 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.851894 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a286812e-873b-4844-a8ee-600ebdf1df1b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.854861 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.865030 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.871637 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.877023 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a286812e-873b-4844-a8ee-600ebdf1df1b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.891026 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 11 
18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.903811 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bb28ce8-aad8-4db8-8492-319989f0059b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h8nlz\" (UID: \"6bb28ce8-aad8-4db8-8492-319989f0059b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.908634 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.925116 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.926291 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-srv-cert\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.930047 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-config-volume\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.943020 4842 request.go:700] Waited for 1.00626779s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcollect-profiles-dockercfg-kzf4t&limit=500&resourceVersion=0 Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.945846 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.948171 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"] Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.966593 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 18:52:56 crc kubenswrapper[4842]: I0311 18:52:56.985966 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.005148 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.015992 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mhz6l"] Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.025050 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 18:52:57 crc kubenswrapper[4842]: W0311 18:52:57.031019 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca012f19_1dcd_41c8_8e17_bb98db200573.slice/crio-bdd395296cd62d853a8a2d12ee39119bac4e7fecdee229747564bed42ef6b246 WatchSource:0}: Error finding container bdd395296cd62d853a8a2d12ee39119bac4e7fecdee229747564bed42ef6b246: Status 404 returned error 
can't find the container with id bdd395296cd62d853a8a2d12ee39119bac4e7fecdee229747564bed42ef6b246 Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.044488 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.064939 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.097926 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.103222 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.104509 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.110397 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-config\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:57 crc kubenswrapper[4842]: E0311 18:52:57.111738 4842 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: 
failed to sync configmap cache: timed out waiting for the condition Mar 11 18:52:57 crc kubenswrapper[4842]: E0311 18:52:57.111842 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-config podName:b086465f-d5e3-4a71-93c4-69fb2bb5b32d nodeName:}" failed. No retries permitted until 2026-03-11 18:52:57.611826048 +0000 UTC m=+223.259522328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-config") pod "kube-storage-version-migrator-operator-b67b599dd-hvd8l" (UID: "b086465f-d5e3-4a71-93c4-69fb2bb5b32d") : failed to sync configmap cache: timed out waiting for the condition Mar 11 18:52:57 crc kubenswrapper[4842]: E0311 18:52:57.111943 4842 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 11 18:52:57 crc kubenswrapper[4842]: E0311 18:52:57.112256 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-serving-cert podName:b086465f-d5e3-4a71-93c4-69fb2bb5b32d nodeName:}" failed. No retries permitted until 2026-03-11 18:52:57.61222031 +0000 UTC m=+223.259916820 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-hvd8l" (UID: "b086465f-d5e3-4a71-93c4-69fb2bb5b32d") : failed to sync secret cache: timed out waiting for the condition Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.124959 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.144430 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.166043 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.185646 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.205418 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.225286 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.244792 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.263720 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 
18:52:57.284870 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.325125 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.344647 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.366699 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.385579 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.405559 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.426009 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.445851 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.465834 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.485168 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.505156 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 11 18:52:57 crc 
kubenswrapper[4842]: I0311 18:52:57.524901 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.545326 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.565601 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.585802 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.604958 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.625472 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.641259 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.641410 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.642405 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.648449 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.648710 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.665013 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.666887 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" event={"ID":"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0","Type":"ContainerStarted","Data":"021ac107197c85e8c39c757d9752c5776545a9bb86894ac96d7dea4bd07ae4b1"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.667008 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" 
event={"ID":"5d11fc16-d0b5-47c4-bad9-c04c26c6a4c0","Type":"ContainerStarted","Data":"e300723e00b2ec723492b06562f0684e4b20db15a14710273d3d58de0cf4be9e"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.668684 4842 generic.go:334] "Generic (PLEG): container finished" podID="855d410e-6475-4d9c-b523-7fa091254a84" containerID="51a5f39b36099f82208c0f34c258c497b2e0b3c7afba205980d9cbf771515233" exitCode=0 Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.668742 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" event={"ID":"855d410e-6475-4d9c-b523-7fa091254a84","Type":"ContainerDied","Data":"51a5f39b36099f82208c0f34c258c497b2e0b3c7afba205980d9cbf771515233"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.668896 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" event={"ID":"855d410e-6475-4d9c-b523-7fa091254a84","Type":"ContainerStarted","Data":"cdafb379746b6aaf897fce0d920db317684c4bf1489659afbe9130ba7d1fa018"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.670621 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mhz6l" event={"ID":"ca012f19-1dcd-41c8-8e17-bb98db200573","Type":"ContainerStarted","Data":"46bc63136542b3a1ac0e1d48c9a24b4fe7cd37882ce10c16b1de17a16f792e93"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.670663 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mhz6l" event={"ID":"ca012f19-1dcd-41c8-8e17-bb98db200573","Type":"ContainerStarted","Data":"bdd395296cd62d853a8a2d12ee39119bac4e7fecdee229747564bed42ef6b246"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.670818 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mhz6l" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.672541 4842 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" event={"ID":"e9a5a3c6-1212-40e7-be6c-c311f3fed92e","Type":"ContainerStarted","Data":"130922282f943af023af6be79a07e1df50e7a0e7ff1db19ece0a70b431f2b513"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.672658 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" event={"ID":"e9a5a3c6-1212-40e7-be6c-c311f3fed92e","Type":"ContainerStarted","Data":"a6ca68ede26cf9b22b97723ef5f9b28ff9836428ebf8601345a3c7574e360e94"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.672960 4842 patch_prober.go:28] interesting pod/downloads-7954f5f757-mhz6l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.673048 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mhz6l" podUID="ca012f19-1dcd-41c8-8e17-bb98db200573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.674841 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" event={"ID":"062df717-6373-434b-a199-26182e191332","Type":"ContainerStarted","Data":"de1626a84e4503883fefe533ff2cfeddca0be1166a9613d693f088198c587eaf"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.674929 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" 
event={"ID":"062df717-6373-434b-a199-26182e191332","Type":"ContainerStarted","Data":"a5da63ef94e58dfea6be4aca38c683c5a384306182891266e67123f6ae5903ff"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.677256 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" event={"ID":"34df260e-28ff-4766-a6ea-5e8df0d34060","Type":"ContainerStarted","Data":"b86812a85fcfe27ddf869c2b34406da6be4920249840b0370d6d4db2d0c7c80b"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.677318 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" event={"ID":"34df260e-28ff-4766-a6ea-5e8df0d34060","Type":"ContainerStarted","Data":"25951ced7d2a1b1ea95a8672ad460d20f7b3522209821042a14b5992280adab8"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.678060 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.679529 4842 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bz2cr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.679562 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" podUID="34df260e-28ff-4766-a6ea-5e8df0d34060" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.680633 4842 generic.go:334] "Generic (PLEG): container finished" podID="d5fa4203-40f4-4fb7-a79a-b983415cd996" 
containerID="3e508aba65e4b91890257a54ca47a90870dd63f96e41dc324a31e3026632c083" exitCode=0 Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.680845 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" event={"ID":"d5fa4203-40f4-4fb7-a79a-b983415cd996","Type":"ContainerDied","Data":"3e508aba65e4b91890257a54ca47a90870dd63f96e41dc324a31e3026632c083"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.680893 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" event={"ID":"d5fa4203-40f4-4fb7-a79a-b983415cd996","Type":"ContainerStarted","Data":"cb2c22ffc2083a102dc177c5fdf0e4430d1f1c3de3f68d279686850edc664a89"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.683691 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" event={"ID":"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7","Type":"ContainerStarted","Data":"bb9cef07db1aa979fb9bb169945dfef02dfccbad60d0c72344cb93aeaa973444"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.683728 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" event={"ID":"a7403eeb-ce97-4c7c-8bab-b69bff33c2f7","Type":"ContainerStarted","Data":"8309c80b4df0827b362da70ea7e5697564b23dfaa095181fa684ac8526006382"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.686818 4842 generic.go:334] "Generic (PLEG): container finished" podID="5ae0b73c-e430-44fb-81b7-9fe4284dc73e" containerID="c76d0d206d580bb9267fd3d0d12c4d2c32f9ebdfc99672c2aacf5027faefbd36" exitCode=0 Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.686876 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" 
event={"ID":"5ae0b73c-e430-44fb-81b7-9fe4284dc73e","Type":"ContainerDied","Data":"c76d0d206d580bb9267fd3d0d12c4d2c32f9ebdfc99672c2aacf5027faefbd36"} Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.692162 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.705175 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.725652 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.753789 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.765907 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.784934 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.805848 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.825661 4842 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.845291 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.864741 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 18:52:57 crc 
kubenswrapper[4842]: I0311 18:52:57.885593 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.904184 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.929839 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.944327 4842 request.go:700] Waited for 1.844145322s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.948730 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.965512 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 18:52:57 crc kubenswrapper[4842]: I0311 18:52:57.985925 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.005242 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.052250 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4gkk\" (UniqueName: \"kubernetes.io/projected/805d8bd0-afa6-4f33-9a24-b83cefb6fac3-kube-api-access-h4gkk\") pod \"catalog-operator-68c6474976-6vjq5\" (UID: \"805d8bd0-afa6-4f33-9a24-b83cefb6fac3\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.090240 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v55jb\" (UniqueName: \"kubernetes.io/projected/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-kube-api-access-v55jb\") pod \"console-f9d7485db-v5z72\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.104994 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0103f6b8-b0b7-4dd4-bb7b-982db80c80ba-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5wmmr\" (UID: \"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.115046 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft44f\" (UniqueName: \"kubernetes.io/projected/6639342f-1d7c-4b9f-9836-2df2063e57b5-kube-api-access-ft44f\") pod \"dns-operator-744455d44c-c6bvx\" (UID: \"6639342f-1d7c-4b9f-9836-2df2063e57b5\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.141830 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4xh\" (UniqueName: \"kubernetes.io/projected/b086465f-d5e3-4a71-93c4-69fb2bb5b32d-kube-api-access-qd4xh\") pod \"kube-storage-version-migrator-operator-b67b599dd-hvd8l\" (UID: \"b086465f-d5e3-4a71-93c4-69fb2bb5b32d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.170602 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdd2\" (UniqueName: 
\"kubernetes.io/projected/27e3ec0d-2ef4-41f5-9c71-c73193bf1279-kube-api-access-8tdd2\") pod \"etcd-operator-b45778765-tl2vq\" (UID: \"27e3ec0d-2ef4-41f5-9c71-c73193bf1279\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.172632 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.182692 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.198211 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mxgm\" (UniqueName: \"kubernetes.io/projected/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-kube-api-access-9mxgm\") pod \"collect-profiles-29554245-ch5xt\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.198837 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dl8s\" (UniqueName: \"kubernetes.io/projected/00e5f3ae-b5e8-463b-ba37-de0d154e5ade-kube-api-access-4dl8s\") pod \"olm-operator-6b444d44fb-c9s2x\" (UID: \"00e5f3ae-b5e8-463b-ba37-de0d154e5ade\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.200984 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.209096 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.229749 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfz8\" (UniqueName: \"kubernetes.io/projected/d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b-kube-api-access-mpfz8\") pod \"machine-approver-56656f9798-5lrdp\" (UID: \"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.230207 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb77p\" (UniqueName: \"kubernetes.io/projected/28316cb3-4478-424c-bf38-43d5645ee769-kube-api-access-xb77p\") pod \"machine-api-operator-5694c8668f-fmkft\" (UID: \"28316cb3-4478-424c-bf38-43d5645ee769\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.241474 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.251573 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sdct\" (UniqueName: \"kubernetes.io/projected/20246915-b29b-42bf-871e-81fcf4b2da46-kube-api-access-9sdct\") pod \"console-operator-58897d9998-f9m6d\" (UID: \"20246915-b29b-42bf-871e-81fcf4b2da46\") " pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.269422 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp6lp\" (UniqueName: \"kubernetes.io/projected/bab77e0c-a6e8-4e8b-a036-695cda94d7db-kube-api-access-hp6lp\") pod \"router-default-5444994796-xgzxx\" (UID: \"bab77e0c-a6e8-4e8b-a036-695cda94d7db\") " pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.290559 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.291886 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.298714 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de8073d6-b58a-41f8-a20e-1de8878ee12a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.314312 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.317663 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a286812e-873b-4844-a8ee-600ebdf1df1b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ws6wr\" (UID: \"a286812e-873b-4844-a8ee-600ebdf1df1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.331568 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.331855 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wb8\" (UniqueName: \"kubernetes.io/projected/043b0177-5671-4fb6-9ffa-5ebe76d5e0f1-kube-api-access-g8wb8\") pod \"authentication-operator-69f744f599-4glcd\" (UID: \"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.352253 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zphsc\" (UniqueName: \"kubernetes.io/projected/6bb28ce8-aad8-4db8-8492-319989f0059b-kube-api-access-zphsc\") pod \"package-server-manager-789f6589d5-h8nlz\" (UID: \"6bb28ce8-aad8-4db8-8492-319989f0059b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.367013 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cf78d4e-beb5-4487-b728-e34230363308-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-kdbnc\" (UID: \"9cf78d4e-beb5-4487-b728-e34230363308\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.388413 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rjr\" (UniqueName: \"kubernetes.io/projected/148bd39e-58ee-4a7f-aa9c-8435ab50d862-kube-api-access-58rjr\") pod \"oauth-openshift-558db77b4-2flrr\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") " pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.434611 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g55xf\" (UniqueName: \"kubernetes.io/projected/de8073d6-b58a-41f8-a20e-1de8878ee12a-kube-api-access-g55xf\") pod \"ingress-operator-5b745b69d9-s8v9m\" (UID: \"de8073d6-b58a-41f8-a20e-1de8878ee12a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.441145 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4tnz\" (UniqueName: \"kubernetes.io/projected/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-kube-api-access-b4tnz\") pod \"route-controller-manager-6576b87f9c-4hgzc\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.448480 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bz2cr"] Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.451808 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.472924 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474330 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-registry-tls\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474355 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0814e2b5-a228-48a8-8dfb-b6a8f51454d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ntf4r\" (UID: \"0814e2b5-a228-48a8-8dfb-b6a8f51454d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474371 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c45514f-e876-486d-9e85-488f53adfdd1-proxy-tls\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474392 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9hng8\" (UID: \"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474504 4842 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1882c06f-22f3-4346-8435-418f034f7d09-ca-trust-extracted\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474563 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5kwf\" (UniqueName: \"kubernetes.io/projected/0814e2b5-a228-48a8-8dfb-b6a8f51454d2-kube-api-access-p5kwf\") pod \"multus-admission-controller-857f4d67dd-ntf4r\" (UID: \"0814e2b5-a228-48a8-8dfb-b6a8f51454d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474585 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474636 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-bound-sa-token\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474718 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0c45514f-e876-486d-9e85-488f53adfdd1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474782 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474811 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1882c06f-22f3-4346-8435-418f034f7d09-installation-pull-secrets\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474839 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swkmf\" (UniqueName: \"kubernetes.io/projected/8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55-kube-api-access-swkmf\") pod \"control-plane-machine-set-operator-78cbb6b69f-9hng8\" (UID: \"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474868 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krnkh\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-kube-api-access-krnkh\") pod \"image-registry-697d97f7c8-twwzj\" (UID: 
\"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474895 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lswvc\" (UniqueName: \"kubernetes.io/projected/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-kube-api-access-lswvc\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474929 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-registry-certificates\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474946 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-images\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474965 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-proxy-tls\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.474996 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wv8m\" (UniqueName: \"kubernetes.io/projected/0c45514f-e876-486d-9e85-488f53adfdd1-kube-api-access-4wv8m\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.475062 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-trusted-ca\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: E0311 18:52:58.488256 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:52:58.988236971 +0000 UTC m=+224.635933251 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.489862 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.498881 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.521963 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.525479 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.533622 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.534074 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"] Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.558799 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.566403 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.579283 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:52:58 crc kubenswrapper[4842]: E0311 18:52:58.579452 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.079435496 +0000 UTC m=+224.727131776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.579703 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.583023 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9hng8\" (UID: \"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.583070 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1882c06f-22f3-4346-8435-418f034f7d09-ca-trust-extracted\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.583115 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5kwf\" (UniqueName: \"kubernetes.io/projected/0814e2b5-a228-48a8-8dfb-b6a8f51454d2-kube-api-access-p5kwf\") pod \"multus-admission-controller-857f4d67dd-ntf4r\" (UID: \"0814e2b5-a228-48a8-8dfb-b6a8f51454d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.583155 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.583198 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e65b875-9006-4ade-b142-411cbad664e5-signing-key\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.583220 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-csi-data-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.583246 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-plugins-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.586040 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.586590 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1882c06f-22f3-4346-8435-418f034f7d09-ca-trust-extracted\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc 
kubenswrapper[4842]: I0311 18:52:58.600353 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-bound-sa-token\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.607480 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9hng8\" (UID: \"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.612029 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c45514f-e876-486d-9e85-488f53adfdd1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.612140 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e65b875-9006-4ade-b142-411cbad664e5-signing-cabundle\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.612222 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-socket-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.612327 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.612352 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1882c06f-22f3-4346-8435-418f034f7d09-installation-pull-secrets\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614031 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swkmf\" (UniqueName: \"kubernetes.io/projected/8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55-kube-api-access-swkmf\") pod \"control-plane-machine-set-operator-78cbb6b69f-9hng8\" (UID: \"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614149 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th8b8\" (UniqueName: \"kubernetes.io/projected/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-kube-api-access-th8b8\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc 
kubenswrapper[4842]: I0311 18:52:58.614232 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-mountpoint-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614514 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krnkh\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-kube-api-access-krnkh\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614588 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lswvc\" (UniqueName: \"kubernetes.io/projected/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-kube-api-access-lswvc\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614678 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-registry-certificates\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614736 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-images\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: 
\"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614763 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-proxy-tls\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.614837 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wv8m\" (UniqueName: \"kubernetes.io/projected/0c45514f-e876-486d-9e85-488f53adfdd1-kube-api-access-4wv8m\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.625208 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqmm4\" (UniqueName: \"kubernetes.io/projected/3e65b875-9006-4ade-b142-411cbad664e5-kube-api-access-jqmm4\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: E0311 18:52:58.627783 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.127763397 +0000 UTC m=+224.775459677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.630090 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-images\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.639957 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-proxy-tls\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.641740 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-trusted-ca\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.643209 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5kwf\" (UniqueName: \"kubernetes.io/projected/0814e2b5-a228-48a8-8dfb-b6a8f51454d2-kube-api-access-p5kwf\") pod 
\"multus-admission-controller-857f4d67dd-ntf4r\" (UID: \"0814e2b5-a228-48a8-8dfb-b6a8f51454d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.645907 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-trusted-ca\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.645973 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-registry-certificates\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.646309 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-bound-sa-token\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.646680 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-registry-tls\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.650418 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-registration-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.654105 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0814e2b5-a228-48a8-8dfb-b6a8f51454d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ntf4r\" (UID: \"0814e2b5-a228-48a8-8dfb-b6a8f51454d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.654172 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c45514f-e876-486d-9e85-488f53adfdd1-proxy-tls\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.667719 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-registry-tls\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.668032 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c45514f-e876-486d-9e85-488f53adfdd1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.672635 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0814e2b5-a228-48a8-8dfb-b6a8f51454d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ntf4r\" (UID: \"0814e2b5-a228-48a8-8dfb-b6a8f51454d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.682697 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1882c06f-22f3-4346-8435-418f034f7d09-installation-pull-secrets\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.686574 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c45514f-e876-486d-9e85-488f53adfdd1-proxy-tls\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.691420 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wv8m\" (UniqueName: \"kubernetes.io/projected/0c45514f-e876-486d-9e85-488f53adfdd1-kube-api-access-4wv8m\") pod \"machine-config-controller-84d6567774-9zbwd\" (UID: \"0c45514f-e876-486d-9e85-488f53adfdd1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.732636 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" event={"ID":"855d410e-6475-4d9c-b523-7fa091254a84","Type":"ContainerStarted","Data":"406d734993654ef8a76a656ef2220099ee69ad3e5e6ed06a89878172e42f04d2"} Mar 11 18:52:58 crc 
kubenswrapper[4842]: I0311 18:52:58.733180 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.735906 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lswvc\" (UniqueName: \"kubernetes.io/projected/edfc5578-a5cb-4a87-bac7-5b82bcd564c1-kube-api-access-lswvc\") pod \"machine-config-operator-74547568cd-ktfqw\" (UID: \"edfc5578-a5cb-4a87-bac7-5b82bcd564c1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.752163 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krnkh\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-kube-api-access-krnkh\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758369 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758489 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxp4\" (UniqueName: \"kubernetes.io/projected/17ad1809-8d6f-414d-9220-4cc69d21544e-kube-api-access-8hxp4\") pod \"migrator-59844c95c7-wm6kt\" (UID: \"17ad1809-8d6f-414d-9220-4cc69d21544e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758534 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a31fa9c-5154-4cd2-a877-05e22e173922-apiservice-cert\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758551 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7scs\" (UniqueName: \"kubernetes.io/projected/bc420b71-1e91-4239-b690-7901661380ef-kube-api-access-p7scs\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758568 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqmm4\" (UniqueName: \"kubernetes.io/projected/3e65b875-9006-4ade-b142-411cbad664e5-kube-api-access-jqmm4\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758583 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfsz\" (UniqueName: \"kubernetes.io/projected/30a9b79e-4043-4dc7-b625-53e0962a745b-kube-api-access-kcfsz\") pod \"auto-csr-approver-29554252-4qdmf\" (UID: \"30a9b79e-4043-4dc7-b625-53e0962a745b\") " pod="openshift-infra/auto-csr-approver-29554252-4qdmf" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758600 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758617 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8-cert\") pod \"ingress-canary-7x6n2\" (UID: \"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8\") " pod="openshift-ingress-canary/ingress-canary-7x6n2" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758638 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c67870e8-4d8e-4c5a-8f5b-6af70b655737-serving-cert\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758652 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67870e8-4d8e-4c5a-8f5b-6af70b655737-config\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758672 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-registration-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758689 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/70abed58-95eb-4b23-b3d8-8a8197874e2c-certs\") pod 
\"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758801 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc420b71-1e91-4239-b690-7901661380ef-config-volume\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758958 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e65b875-9006-4ade-b142-411cbad664e5-signing-key\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758979 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtk5z\" (UniqueName: \"kubernetes.io/projected/c67870e8-4d8e-4c5a-8f5b-6af70b655737-kube-api-access-xtk5z\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759043 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-csi-data-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759060 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/7a31fa9c-5154-4cd2-a877-05e22e173922-tmpfs\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759188 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-plugins-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759233 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv62m\" (UniqueName: \"kubernetes.io/projected/ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8-kube-api-access-wv62m\") pod \"ingress-canary-7x6n2\" (UID: \"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8\") " pod="openshift-ingress-canary/ingress-canary-7x6n2" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759294 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759310 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgp6x\" (UniqueName: \"kubernetes.io/projected/6e8e2825-2a37-4731-bc73-4e469bc34334-kube-api-access-pgp6x\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 
18:52:58.759353 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e65b875-9006-4ade-b142-411cbad664e5-signing-cabundle\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759369 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97pb8\" (UniqueName: \"kubernetes.io/projected/70abed58-95eb-4b23-b3d8-8a8197874e2c-kube-api-access-97pb8\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759389 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-socket-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759439 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th8b8\" (UniqueName: \"kubernetes.io/projected/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-kube-api-access-th8b8\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759455 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-mountpoint-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc 
kubenswrapper[4842]: I0311 18:52:58.759470 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/70abed58-95eb-4b23-b3d8-8a8197874e2c-node-bootstrap-token\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759506 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc420b71-1e91-4239-b690-7901661380ef-metrics-tls\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759526 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jgqn\" (UniqueName: \"kubernetes.io/projected/7a31fa9c-5154-4cd2-a877-05e22e173922-kube-api-access-7jgqn\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.759604 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a31fa9c-5154-4cd2-a877-05e22e173922-webhook-cert\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: E0311 18:52:58.759883 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 18:52:59.25985819 +0000 UTC m=+224.907554630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.758677 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swkmf\" (UniqueName: \"kubernetes.io/projected/8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55-kube-api-access-swkmf\") pod \"control-plane-machine-set-operator-78cbb6b69f-9hng8\" (UID: \"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.763244 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-registration-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.765679 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-csi-data-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.766888 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" 
event={"ID":"d5fa4203-40f4-4fb7-a79a-b983415cd996","Type":"ContainerStarted","Data":"bddecf54583517864e4dc75624c5c37b28a44b37289550c40ff2fb3d03d0de12"} Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.767336 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3e65b875-9006-4ade-b142-411cbad664e5-signing-cabundle\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.772652 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-mountpoint-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.775837 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-socket-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.775956 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-plugins-dir\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.780641 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3e65b875-9006-4ade-b142-411cbad664e5-signing-key\") pod \"service-ca-9c57cc56f-zkds6\" (UID: 
\"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.785299 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" event={"ID":"5ae0b73c-e430-44fb-81b7-9fe4284dc73e","Type":"ContainerStarted","Data":"eff67722155153a7b4e38bf3b38fdf37d6be76694e801174cf77bd89ecec27ac"} Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.785637 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" event={"ID":"5ae0b73c-e430-44fb-81b7-9fe4284dc73e","Type":"ContainerStarted","Data":"9d316fee9edd4e70041c823216abe0379a95ec3ad5b4c5779567172c1699d9ca"} Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.789000 4842 patch_prober.go:28] interesting pod/downloads-7954f5f757-mhz6l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.789149 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mhz6l" podUID="ca012f19-1dcd-41c8-8e17-bb98db200573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.794470 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th8b8\" (UniqueName: \"kubernetes.io/projected/84bc584f-a1e1-499e-acd5-f3a3aa3efe69-kube-api-access-th8b8\") pod \"csi-hostpathplugin-vlcsj\" (UID: \"84bc584f-a1e1-499e-acd5-f3a3aa3efe69\") " pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.814100 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-v5z72"] Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.817626 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.816847 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqmm4\" (UniqueName: \"kubernetes.io/projected/3e65b875-9006-4ade-b142-411cbad664e5-kube-api-access-jqmm4\") pod \"service-ca-9c57cc56f-zkds6\" (UID: \"3e65b875-9006-4ade-b142-411cbad664e5\") " pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873167 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc420b71-1e91-4239-b690-7901661380ef-config-volume\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873227 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtk5z\" (UniqueName: \"kubernetes.io/projected/c67870e8-4d8e-4c5a-8f5b-6af70b655737-kube-api-access-xtk5z\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873252 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7a31fa9c-5154-4cd2-a877-05e22e173922-tmpfs\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873291 4842 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wv62m\" (UniqueName: \"kubernetes.io/projected/ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8-kube-api-access-wv62m\") pod \"ingress-canary-7x6n2\" (UID: \"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8\") " pod="openshift-ingress-canary/ingress-canary-7x6n2" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873317 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873335 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgp6x\" (UniqueName: \"kubernetes.io/projected/6e8e2825-2a37-4731-bc73-4e469bc34334-kube-api-access-pgp6x\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873368 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97pb8\" (UniqueName: \"kubernetes.io/projected/70abed58-95eb-4b23-b3d8-8a8197874e2c-kube-api-access-97pb8\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873385 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873410 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/70abed58-95eb-4b23-b3d8-8a8197874e2c-node-bootstrap-token\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873427 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc420b71-1e91-4239-b690-7901661380ef-metrics-tls\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873446 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jgqn\" (UniqueName: \"kubernetes.io/projected/7a31fa9c-5154-4cd2-a877-05e22e173922-kube-api-access-7jgqn\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873463 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a31fa9c-5154-4cd2-a877-05e22e173922-webhook-cert\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873495 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxp4\" (UniqueName: \"kubernetes.io/projected/17ad1809-8d6f-414d-9220-4cc69d21544e-kube-api-access-8hxp4\") pod \"migrator-59844c95c7-wm6kt\" (UID: 
\"17ad1809-8d6f-414d-9220-4cc69d21544e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873514 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a31fa9c-5154-4cd2-a877-05e22e173922-apiservice-cert\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873530 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7scs\" (UniqueName: \"kubernetes.io/projected/bc420b71-1e91-4239-b690-7901661380ef-kube-api-access-p7scs\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873548 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfsz\" (UniqueName: \"kubernetes.io/projected/30a9b79e-4043-4dc7-b625-53e0962a745b-kube-api-access-kcfsz\") pod \"auto-csr-approver-29554252-4qdmf\" (UID: \"30a9b79e-4043-4dc7-b625-53e0962a745b\") " pod="openshift-infra/auto-csr-approver-29554252-4qdmf" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873566 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8-cert\") pod \"ingress-canary-7x6n2\" (UID: \"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8\") " pod="openshift-ingress-canary/ingress-canary-7x6n2" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873580 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873600 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c67870e8-4d8e-4c5a-8f5b-6af70b655737-serving-cert\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873616 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67870e8-4d8e-4c5a-8f5b-6af70b655737-config\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.873637 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/70abed58-95eb-4b23-b3d8-8a8197874e2c-certs\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.898805 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67870e8-4d8e-4c5a-8f5b-6af70b655737-config\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.898933 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq"
Mar 11 18:52:58 crc kubenswrapper[4842]: E0311 18:52:58.899355 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.399336943 +0000 UTC m=+225.047033223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.900878 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc420b71-1e91-4239-b690-7901661380ef-config-volume\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.901526 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.901857 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/70abed58-95eb-4b23-b3d8-8a8197874e2c-certs\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.904995 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a31fa9c-5154-4cd2-a877-05e22e173922-webhook-cert\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.906576 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.911372 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tl2vq"]
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.917518 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.920356 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7a31fa9c-5154-4cd2-a877-05e22e173922-tmpfs\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.923313 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c67870e8-4d8e-4c5a-8f5b-6af70b655737-serving-cert\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.928217 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8-cert\") pod \"ingress-canary-7x6n2\" (UID: \"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8\") " pod="openshift-ingress-canary/ingress-canary-7x6n2"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.929476 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.943933 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.946428 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a31fa9c-5154-4cd2-a877-05e22e173922-apiservice-cert\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.948924 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7scs\" (UniqueName: \"kubernetes.io/projected/bc420b71-1e91-4239-b690-7901661380ef-kube-api-access-p7scs\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.953560 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfsz\" (UniqueName: \"kubernetes.io/projected/30a9b79e-4043-4dc7-b625-53e0962a745b-kube-api-access-kcfsz\") pod \"auto-csr-approver-29554252-4qdmf\" (UID: \"30a9b79e-4043-4dc7-b625-53e0962a745b\") " pod="openshift-infra/auto-csr-approver-29554252-4qdmf"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.956702 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/70abed58-95eb-4b23-b3d8-8a8197874e2c-node-bootstrap-token\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.957034 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zkds6"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.971357 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc420b71-1e91-4239-b690-7901661380ef-metrics-tls\") pod \"dns-default-4pbjx\" (UID: \"bc420b71-1e91-4239-b690-7901661380ef\") " pod="openshift-dns/dns-default-4pbjx"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.976409 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.977205 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hxp4\" (UniqueName: \"kubernetes.io/projected/17ad1809-8d6f-414d-9220-4cc69d21544e-kube-api-access-8hxp4\") pod \"migrator-59844c95c7-wm6kt\" (UID: \"17ad1809-8d6f-414d-9220-4cc69d21544e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt"
Mar 11 18:52:58 crc kubenswrapper[4842]: I0311 18:52:58.977680 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554252-4qdmf"
Mar 11 18:52:58 crc kubenswrapper[4842]: E0311 18:52:58.977240 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.477205117 +0000 UTC m=+225.124901397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.006793 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.007042 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtk5z\" (UniqueName: \"kubernetes.io/projected/c67870e8-4d8e-4c5a-8f5b-6af70b655737-kube-api-access-xtk5z\") pod \"service-ca-operator-777779d784-zfcwb\" (UID: \"c67870e8-4d8e-4c5a-8f5b-6af70b655737\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.009044 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jgqn\" (UniqueName: \"kubernetes.io/projected/7a31fa9c-5154-4cd2-a877-05e22e173922-kube-api-access-7jgqn\") pod \"packageserver-d55dfcdfc-kfsrt\" (UID: \"7a31fa9c-5154-4cd2-a877-05e22e173922\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.009434 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4pbjx"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.015802 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.024567 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97pb8\" (UniqueName: \"kubernetes.io/projected/70abed58-95eb-4b23-b3d8-8a8197874e2c-kube-api-access-97pb8\") pod \"machine-config-server-fz2xd\" (UID: \"70abed58-95eb-4b23-b3d8-8a8197874e2c\") " pod="openshift-machine-config-operator/machine-config-server-fz2xd"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.042654 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fz2xd"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.051131 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgp6x\" (UniqueName: \"kubernetes.io/projected/6e8e2825-2a37-4731-bc73-4e469bc34334-kube-api-access-pgp6x\") pod \"marketplace-operator-79b997595-czxnq\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " pod="openshift-marketplace/marketplace-operator-79b997595-czxnq"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.086177 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.086844 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.586827078 +0000 UTC m=+225.234523358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.106817 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv62m\" (UniqueName: \"kubernetes.io/projected/ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8-kube-api-access-wv62m\") pod \"ingress-canary-7x6n2\" (UID: \"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8\") " pod="openshift-ingress-canary/ingress-canary-7x6n2"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.166953 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6bvx"]
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.171123 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fmkft"]
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.189023 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.189483 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.68946482 +0000 UTC m=+225.337161100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.245417 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt"]
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.256793 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.286908 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.291777 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.292103 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.792090781 +0000 UTC m=+225.439787061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.301922 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x"]
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.302333 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.340327 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7x6n2"
Mar 11 18:52:59 crc kubenswrapper[4842]: W0311 18:52:59.351505 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28316cb3_4478_424c_bf38_43d5645ee769.slice/crio-2edce2748bce0249e784c8dd7ed7e3336b0a746e081e6f67fa5447785287cacc WatchSource:0}: Error finding container 2edce2748bce0249e784c8dd7ed7e3336b0a746e081e6f67fa5447785287cacc: Status 404 returned error can't find the container with id 2edce2748bce0249e784c8dd7ed7e3336b0a746e081e6f67fa5447785287cacc
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.392786 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.393380 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.893357024 +0000 UTC m=+225.541053304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.476767 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5"]
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.498133 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.498586 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:52:59.998545658 +0000 UTC m=+225.646241938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.504343 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l"]
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.599773 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.600089 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.100073008 +0000 UTC m=+225.747769278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.673584 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs" podStartSLOduration=161.673557937 podStartE2EDuration="2m41.673557937s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:59.672799965 +0000 UTC m=+225.320496245" watchObservedRunningTime="2026-03-11 18:52:59.673557937 +0000 UTC m=+225.321254207"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.701015 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.701402 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.201389492 +0000 UTC m=+225.849085772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.804027 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.804439 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.304411014 +0000 UTC m=+225.952107294 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.813820 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" event={"ID":"28316cb3-4478-424c-bf38-43d5645ee769","Type":"ContainerStarted","Data":"2edce2748bce0249e784c8dd7ed7e3336b0a746e081e6f67fa5447785287cacc"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.829663 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" event={"ID":"27e3ec0d-2ef4-41f5-9c71-c73193bf1279","Type":"ContainerStarted","Data":"ea01d85a4fee1ac39deaef199f5192f95a8f8b7c44d76abca1c8c9fbbd233d1b"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.836193 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xgzxx" event={"ID":"bab77e0c-a6e8-4e8b-a036-695cda94d7db","Type":"ContainerStarted","Data":"7f7c2a020a4c96e70bf2cd5d5abb08f33a12c5eeeadc0b8d74d19cba68be201e"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.838357 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5z72" event={"ID":"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86","Type":"ContainerStarted","Data":"8e449753f2f70334d54ab85a1f093c5f791bc358184e45de0e50cc9543e40998"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.840018 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" event={"ID":"6639342f-1d7c-4b9f-9836-2df2063e57b5","Type":"ContainerStarted","Data":"ea7c5016e6a83f1bb3fac0c19561910a5e0f777112542aee191fae80d272cbeb"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.871167 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" event={"ID":"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b","Type":"ContainerStarted","Data":"06f4a1c232d53a5e71190c999f7f604ef30e43bc7da6d5aa449ed6ef6415ab3f"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.875225 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skx79" podStartSLOduration=161.875206266 podStartE2EDuration="2m41.875206266s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:59.871024967 +0000 UTC m=+225.518721247" watchObservedRunningTime="2026-03-11 18:52:59.875206266 +0000 UTC m=+225.522902536"
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.894148 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" event={"ID":"a2bd84be-1f01-47e4-a35e-4ed993d4be9b","Type":"ContainerStarted","Data":"905c961ad3c43f06971f930404ad9ee6559939aaa0709a7f55dd23e5ac590dcb"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.905212 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:52:59 crc kubenswrapper[4842]: E0311 18:52:59.905614 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.405600074 +0000 UTC m=+226.053296354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.909852 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fz2xd" event={"ID":"70abed58-95eb-4b23-b3d8-8a8197874e2c","Type":"ContainerStarted","Data":"442141658ca938c52c0e6a3f40041f0788ac0b7a8f8c786ef02d81a28e415b7b"}
Mar 11 18:52:59 crc kubenswrapper[4842]: I0311 18:52:59.972492 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-dh5kz" podStartSLOduration=161.972471793 podStartE2EDuration="2m41.972471793s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:59.931970197 +0000 UTC m=+225.579666477" watchObservedRunningTime="2026-03-11 18:52:59.972471793 +0000 UTC m=+225.620168073"
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.000555 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-r52q4" podStartSLOduration=162.000539685 podStartE2EDuration="2m42.000539685s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:52:59.999787233 +0000 UTC m=+225.647483513" watchObservedRunningTime="2026-03-11 18:53:00.000539685 +0000 UTC m=+225.648235965"
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.001641 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" podUID="34df260e-28ff-4766-a6ea-5e8df0d34060" containerName="controller-manager" containerID="cri-o://b86812a85fcfe27ddf869c2b34406da6be4920249840b0370d6d4db2d0c7c80b" gracePeriod=30
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.001867 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" event={"ID":"00e5f3ae-b5e8-463b-ba37-de0d154e5ade","Type":"ContainerStarted","Data":"751ec9a042a2a0b289aa4ec86f829dd6a0de57da65159293434dd47ccd6805fb"}
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.006973 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.007523 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.507503174 +0000 UTC m=+226.155199454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.029845 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zxzbs"
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.108366 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.110345 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.610328331 +0000 UTC m=+226.258024611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.113799 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" podStartSLOduration=162.113783719 podStartE2EDuration="2m42.113783719s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:00.113390758 +0000 UTC m=+225.761087038" watchObservedRunningTime="2026-03-11 18:53:00.113783719 +0000 UTC m=+225.761479999"
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.150721 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-sgbvv" podStartSLOduration=162.150702494 podStartE2EDuration="2m42.150702494s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:00.149314644 +0000 UTC m=+225.797010924" watchObservedRunningTime="2026-03-11 18:53:00.150702494 +0000 UTC m=+225.798398774"
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.231915 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.232238 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.732212062 +0000 UTC m=+226.379908342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.310954 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" podStartSLOduration=162.310934261 podStartE2EDuration="2m42.310934261s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:00.308866212 +0000 UTC m=+225.956562492" watchObservedRunningTime="2026-03-11 18:53:00.310934261 +0000 UTC m=+225.958630541"
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.327623 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr"]
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.335411 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.335924 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.835870893 +0000 UTC m=+226.483567163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.361173 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mhz6l" podStartSLOduration=162.361157605 podStartE2EDuration="2m42.361157605s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:00.335020579 +0000 UTC m=+225.982716859" watchObservedRunningTime="2026-03-11 18:53:00.361157605 +0000 UTC m=+226.008853885"
Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.382739 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2" podStartSLOduration=161.382707281 podStartE2EDuration="2m41.382707281s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC"
observedRunningTime="2026-03-11 18:53:00.362415311 +0000 UTC m=+226.010111591" watchObservedRunningTime="2026-03-11 18:53:00.382707281 +0000 UTC m=+226.030403561" Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.390025 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f9m6d"] Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.423077 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr"] Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.436483 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.436859 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:00.936840987 +0000 UTC m=+226.584537267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.447859 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4glcd"] Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.474632 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2flrr"] Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.538555 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.538953 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.038934783 +0000 UTC m=+226.686631063 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:00 crc kubenswrapper[4842]: W0311 18:53:00.632836 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043b0177_5671_4fb6_9ffa_5ebe76d5e0f1.slice/crio-e9f85f9468ac145c109520a84cafb1abe96511c35f376fa0ab4bcb7c79215ce4 WatchSource:0}: Error finding container e9f85f9468ac145c109520a84cafb1abe96511c35f376fa0ab4bcb7c79215ce4: Status 404 returned error can't find the container with id e9f85f9468ac145c109520a84cafb1abe96511c35f376fa0ab4bcb7c79215ce4 Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.640515 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.140467713 +0000 UTC m=+226.788163993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.640652 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.641033 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.648650 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.148627396 +0000 UTC m=+226.796323676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.741875 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.742822 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.242804826 +0000 UTC m=+226.890501106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.843874 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.844478 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.344459909 +0000 UTC m=+226.992156189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.944479 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.944707 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.444663491 +0000 UTC m=+227.092359761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:00 crc kubenswrapper[4842]: I0311 18:53:00.945238 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:00 crc kubenswrapper[4842]: E0311 18:53:00.945565 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.445545536 +0000 UTC m=+227.093241816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.027391 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" event={"ID":"20246915-b29b-42bf-871e-81fcf4b2da46","Type":"ContainerStarted","Data":"dc32407e16f4fafa8112085ad56af03bfb5f78549ed373b5072721cf508b4ade"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.027454 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" event={"ID":"20246915-b29b-42bf-871e-81fcf4b2da46","Type":"ContainerStarted","Data":"1ca344ed182d6bd8bc7b267cee88539388c0c9d0baae5be43ba5e7a3a4d0f916"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.027832 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.031823 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" event={"ID":"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b","Type":"ContainerStarted","Data":"9c1cdc963a0329182894e71a3abaf2e60efa3cf7f846f0d2f270693a879b5f3f"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.037591 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" 
event={"ID":"a2bd84be-1f01-47e4-a35e-4ed993d4be9b","Type":"ContainerStarted","Data":"f0c71b5e37069aa7fd05e8a991e02e22b32f744273fdbe6f859af7480faf366a"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.047077 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.047140 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.047314 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.547257391 +0000 UTC m=+227.194953671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.047403 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.047829 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.547813437 +0000 UTC m=+227.195509717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.049386 4842 patch_prober.go:28] interesting pod/console-operator-58897d9998-f9m6d container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.049446 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" podUID="20246915-b29b-42bf-871e-81fcf4b2da46" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.051889 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" podStartSLOduration=163.051869543 podStartE2EDuration="2m43.051869543s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.049740452 +0000 UTC m=+226.697436732" watchObservedRunningTime="2026-03-11 18:53:01.051869543 +0000 UTC m=+226.699565813" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.059837 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" event={"ID":"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba","Type":"ContainerStarted","Data":"5b427ab1d56198af6d28631c1b5eb0a36f493cf3b39fd67e7813040001c5e5f1"} Mar 11 18:53:01 crc kubenswrapper[4842]: W0311 18:53:01.064562 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8073d6_b58a_41f8_a20e_1de8878ee12a.slice/crio-c97978d0e501cf09e0e8f0671fd550a4d3873a8598db9a277cdd6439a55bd8fe WatchSource:0}: Error finding container c97978d0e501cf09e0e8f0671fd550a4d3873a8598db9a277cdd6439a55bd8fe: Status 404 returned error can't find the container with id c97978d0e501cf09e0e8f0671fd550a4d3873a8598db9a277cdd6439a55bd8fe Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.065656 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" event={"ID":"805d8bd0-afa6-4f33-9a24-b83cefb6fac3","Type":"ContainerStarted","Data":"ecc9ca8535af5b0894075fb051caa41ebdb09372d4b73367715fa7c034e636a5"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.065704 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" event={"ID":"805d8bd0-afa6-4f33-9a24-b83cefb6fac3","Type":"ContainerStarted","Data":"9e3bf990ec683ccf8bdc62d2a3a0cc6a79d62bc39e589d2c823d3b16377c42df"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.067149 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.071060 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" podStartSLOduration=163.071046221 podStartE2EDuration="2m43.071046221s" 
podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.067718986 +0000 UTC m=+226.715415276" watchObservedRunningTime="2026-03-11 18:53:01.071046221 +0000 UTC m=+226.718742501" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.077039 4842 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-6vjq5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.077085 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" podUID="805d8bd0-afa6-4f33-9a24-b83cefb6fac3" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.095133 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.098552 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5" podStartSLOduration=162.098532486 podStartE2EDuration="2m42.098532486s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.096399675 +0000 UTC m=+226.744095955" watchObservedRunningTime="2026-03-11 18:53:01.098532486 +0000 UTC m=+226.746228756" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.114945 4842 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" event={"ID":"b086465f-d5e3-4a71-93c4-69fb2bb5b32d","Type":"ContainerStarted","Data":"42641acfec9d457ddef7b4848712e5842cffda44ceb48ad8a54b1d9ea35c7146"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.114997 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" event={"ID":"b086465f-d5e3-4a71-93c4-69fb2bb5b32d","Type":"ContainerStarted","Data":"0f0430b8040391d5922d7cce8c5a83969cc902f634be09b8d3a2d10c44dcd419"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.121491 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.136343 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fz2xd" event={"ID":"70abed58-95eb-4b23-b3d8-8a8197874e2c","Type":"ContainerStarted","Data":"79db5d3423f34f285ec9de375c30718b7d3e7f2955b2a1f1269066ee3a621d84"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.137480 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ntf4r"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.141046 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.146915 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" event={"ID":"6639342f-1d7c-4b9f-9836-2df2063e57b5","Type":"ContainerStarted","Data":"1f471e87a45125eaff5714ad7c65ca5e53efe88a56cb5a75d9c312d9f1b33402"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.151687 4842 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.153036 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.653017152 +0000 UTC m=+227.300713432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.156395 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hvd8l" podStartSLOduration=162.156369598 podStartE2EDuration="2m42.156369598s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.136032547 +0000 UTC m=+226.783728827" watchObservedRunningTime="2026-03-11 18:53:01.156369598 +0000 UTC m=+226.804065878" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.156671 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.159936 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5z72" event={"ID":"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86","Type":"ContainerStarted","Data":"f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.180760 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" event={"ID":"00e5f3ae-b5e8-463b-ba37-de0d154e5ade","Type":"ContainerStarted","Data":"e703da6366362fcc25fcefdfcb7e5213d998892a5aca7ad4a781b4cb38bc4a21"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.182023 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.188289 4842 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c9s2x container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.188341 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" podUID="00e5f3ae-b5e8-463b-ba37-de0d154e5ade" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.198984 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-fz2xd" podStartSLOduration=6.198961664 podStartE2EDuration="6.198961664s" 
podCreationTimestamp="2026-03-11 18:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.182238577 +0000 UTC m=+226.829934857" watchObservedRunningTime="2026-03-11 18:53:01.198961664 +0000 UTC m=+226.846657944" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.215411 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.229219 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xgzxx" event={"ID":"bab77e0c-a6e8-4e8b-a036-695cda94d7db","Type":"ContainerStarted","Data":"a90e1082146b399b0e9a63f91d71e5617143290d3854560d07a66c747244eab9"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.248197 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x" podStartSLOduration=162.24817864 podStartE2EDuration="2m42.24817864s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.215779565 +0000 UTC m=+226.863475845" watchObservedRunningTime="2026-03-11 18:53:01.24817864 +0000 UTC m=+226.895874920" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.253794 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-config\") pod \"34df260e-28ff-4766-a6ea-5e8df0d34060\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.255261 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/34df260e-28ff-4766-a6ea-5e8df0d34060-serving-cert\") pod \"34df260e-28ff-4766-a6ea-5e8df0d34060\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.255466 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-proxy-ca-bundles\") pod \"34df260e-28ff-4766-a6ea-5e8df0d34060\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.255721 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvhrh\" (UniqueName: \"kubernetes.io/projected/34df260e-28ff-4766-a6ea-5e8df0d34060-kube-api-access-qvhrh\") pod \"34df260e-28ff-4766-a6ea-5e8df0d34060\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.256003 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-client-ca\") pod \"34df260e-28ff-4766-a6ea-5e8df0d34060\" (UID: \"34df260e-28ff-4766-a6ea-5e8df0d34060\") " Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.256694 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.256924 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "34df260e-28ff-4766-a6ea-5e8df0d34060" 
(UID: "34df260e-28ff-4766-a6ea-5e8df0d34060"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.257147 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.258123 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-client-ca" (OuterVolumeSpecName: "client-ca") pod "34df260e-28ff-4766-a6ea-5e8df0d34060" (UID: "34df260e-28ff-4766-a6ea-5e8df0d34060"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.259545 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.759532814 +0000 UTC m=+227.407229094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.271089 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34df260e-28ff-4766-a6ea-5e8df0d34060-kube-api-access-qvhrh" (OuterVolumeSpecName: "kube-api-access-qvhrh") pod "34df260e-28ff-4766-a6ea-5e8df0d34060" (UID: "34df260e-28ff-4766-a6ea-5e8df0d34060"). InnerVolumeSpecName "kube-api-access-qvhrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.278494 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34df260e-28ff-4766-a6ea-5e8df0d34060-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "34df260e-28ff-4766-a6ea-5e8df0d34060" (UID: "34df260e-28ff-4766-a6ea-5e8df0d34060"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.279853 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-config" (OuterVolumeSpecName: "config") pod "34df260e-28ff-4766-a6ea-5e8df0d34060" (UID: "34df260e-28ff-4766-a6ea-5e8df0d34060"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.284020 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" event={"ID":"28316cb3-4478-424c-bf38-43d5645ee769","Type":"ContainerStarted","Data":"bbba35866be87eb1e1d17a10ca38b79e7d1d07116a45874f540ae14d0c467fe9"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.284084 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" event={"ID":"28316cb3-4478-424c-bf38-43d5645ee769","Type":"ContainerStarted","Data":"a84a83e87edacd4fc9e3bb760bd52606e74465fe283fc323afb30c2756a9b75f"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.289462 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" event={"ID":"148bd39e-58ee-4a7f-aa9c-8435ab50d862","Type":"ContainerStarted","Data":"158c32959cf9c93533c19c03164fc807e428b1454af9e711a0588b231b7ed290"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.299737 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-v5z72" podStartSLOduration=163.299713062 podStartE2EDuration="2m43.299713062s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.253647486 +0000 UTC m=+226.901343786" watchObservedRunningTime="2026-03-11 18:53:01.299713062 +0000 UTC m=+226.947409342" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.312174 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" event={"ID":"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1","Type":"ContainerStarted","Data":"e9f85f9468ac145c109520a84cafb1abe96511c35f376fa0ab4bcb7c79215ce4"} Mar 11 
18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.320239 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7dd97fc889-68h7m"] Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.320790 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34df260e-28ff-4766-a6ea-5e8df0d34060" containerName="controller-manager" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.320814 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="34df260e-28ff-4766-a6ea-5e8df0d34060" containerName="controller-manager" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.321035 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="34df260e-28ff-4766-a6ea-5e8df0d34060" containerName="controller-manager" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.321889 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.332643 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dd97fc889-68h7m"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.332839 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xgzxx" podStartSLOduration=163.332811067 podStartE2EDuration="2m43.332811067s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.305050575 +0000 UTC m=+226.952746855" watchObservedRunningTime="2026-03-11 18:53:01.332811067 +0000 UTC m=+226.980507347" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.340883 4842 generic.go:334] "Generic (PLEG): container finished" podID="34df260e-28ff-4766-a6ea-5e8df0d34060" 
containerID="b86812a85fcfe27ddf869c2b34406da6be4920249840b0370d6d4db2d0c7c80b" exitCode=0 Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.340987 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" event={"ID":"34df260e-28ff-4766-a6ea-5e8df0d34060","Type":"ContainerDied","Data":"b86812a85fcfe27ddf869c2b34406da6be4920249840b0370d6d4db2d0c7c80b"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.341020 4842 scope.go:117] "RemoveContainer" containerID="b86812a85fcfe27ddf869c2b34406da6be4920249840b0370d6d4db2d0c7c80b" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.341159 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bz2cr" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.353326 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" event={"ID":"a286812e-873b-4844-a8ee-600ebdf1df1b","Type":"ContainerStarted","Data":"24a4971d3dcc589216f00547145c46e42711efc4abf7ce299ea4739d58438227"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.359069 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.359185 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.85916139 +0000 UTC m=+227.506857670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.366832 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fmkft" podStartSLOduration=162.366803508 podStartE2EDuration="2m42.366803508s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.323050659 +0000 UTC m=+226.970746939" watchObservedRunningTime="2026-03-11 18:53:01.366803508 +0000 UTC m=+227.014499788" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.370364 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.370630 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-config\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.370688 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-proxy-ca-bundles\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: 
\"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.370768 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hjf6\" (UniqueName: \"kubernetes.io/projected/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-kube-api-access-8hjf6\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.370952 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-serving-cert\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.371095 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.371192 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-client-ca\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.371253 4842 reconciler_common.go:293] "Volume detached 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.371517 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34df260e-28ff-4766-a6ea-5e8df0d34060-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.371544 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34df260e-28ff-4766-a6ea-5e8df0d34060-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.371561 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvhrh\" (UniqueName: \"kubernetes.io/projected/34df260e-28ff-4766-a6ea-5e8df0d34060-kube-api-access-qvhrh\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.372858 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.872843881 +0000 UTC m=+227.520540161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.373334 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.376123 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" event={"ID":"27e3ec0d-2ef4-41f5-9c71-c73193bf1279","Type":"ContainerStarted","Data":"a3122fdf322bf0a9b4f618173aeb90b861a6993ce70dcb204a9caab193441028"} Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.408624 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" podStartSLOduration=163.408601592 podStartE2EDuration="2m43.408601592s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.358972645 +0000 UTC m=+227.006668925" watchObservedRunningTime="2026-03-11 18:53:01.408601592 +0000 UTC m=+227.056297872" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.422419 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tl2vq" podStartSLOduration=163.422399486 podStartE2EDuration="2m43.422399486s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:01.415081257 +0000 UTC m=+227.062777537" watchObservedRunningTime="2026-03-11 18:53:01.422399486 +0000 UTC m=+227.070095766" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.438482 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zkds6"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.463414 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bz2cr"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.471497 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.471558 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.472635 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.475684 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 18:53:01.975657827 +0000 UTC m=+227.623354107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.473138 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-serving-cert\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.479172 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.479340 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-client-ca\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.479541 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-config\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.479598 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-proxy-ca-bundles\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.479674 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hjf6\" (UniqueName: \"kubernetes.io/projected/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-kube-api-access-8hjf6\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.480531 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:01.980514156 +0000 UTC m=+227.628210436 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.493351 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.499725 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-client-ca\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.502557 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bz2cr"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.530110 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-proxy-ca-bundles\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.530132 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-config\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " 
pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.530430 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:01 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld Mar 11 18:53:01 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:01 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.530493 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.530583 4842 patch_prober.go:28] interesting pod/apiserver-76f77b778f-h2kpt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]log ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]etcd ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/generic-apiserver-start-informers ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/max-in-flight-filter ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 11 18:53:01 crc kubenswrapper[4842]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 11 18:53:01 crc kubenswrapper[4842]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/project.openshift.io-projectcache ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/openshift.io-startinformers ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 11 18:53:01 crc kubenswrapper[4842]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 11 18:53:01 crc kubenswrapper[4842]: livez check failed Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.530741 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" podUID="5ae0b73c-e430-44fb-81b7-9fe4284dc73e" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.531045 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-serving-cert\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.532345 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt"] Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.534667 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hjf6\" (UniqueName: \"kubernetes.io/projected/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-kube-api-access-8hjf6\") pod \"controller-manager-7dd97fc889-68h7m\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:01 
crc kubenswrapper[4842]: I0311 18:53:01.576515 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.585827 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.586285 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.086256306 +0000 UTC m=+227.733952586 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.587733 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7x6n2"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.589977 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.600813 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.601691 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pbjx"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.605183 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554252-4qdmf"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.610767 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.610811 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vlcsj"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.610822 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-czxnq"]
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.631693 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.632624 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:53:01 crc kubenswrapper[4842]: W0311 18:53:01.634220 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb0fc17_b6ca_4065_86bf_4cbcc78a91c8.slice/crio-986dba7d764c03452e8abf36a1b6d926c6f5fc06712c379bbf6d5a3d1088c983 WatchSource:0}: Error finding container 986dba7d764c03452e8abf36a1b6d926c6f5fc06712c379bbf6d5a3d1088c983: Status 404 returned error can't find the container with id 986dba7d764c03452e8abf36a1b6d926c6f5fc06712c379bbf6d5a3d1088c983
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.668138 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m"
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.670472 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.688052 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.690302 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.190263277 +0000 UTC m=+227.837959557 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.735636 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.789719 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.790180 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.29016336 +0000 UTC m=+227.937859640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.892065 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.892401 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.39238863 +0000 UTC m=+228.040084910 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.997050 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.997852 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.497824352 +0000 UTC m=+228.145520632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:01 crc kubenswrapper[4842]: I0311 18:53:01.997938 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:01 crc kubenswrapper[4842]: E0311 18:53:01.998605 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.498593184 +0000 UTC m=+228.146289464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.006838 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.102978 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.103895 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.603880361 +0000 UTC m=+228.251576641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.185148 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dd97fc889-68h7m"]
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.204141 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.204442 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.704430923 +0000 UTC m=+228.352127203 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.304894 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.305295 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.805279123 +0000 UTC m=+228.452975403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.393855 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" event={"ID":"3e65b875-9006-4ade-b142-411cbad664e5","Type":"ContainerStarted","Data":"9fdfd8dd889270ed81906d787cf7880fb9809bb54725d2c342d7bc9438aaaae1"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.394357 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" event={"ID":"3e65b875-9006-4ade-b142-411cbad664e5","Type":"ContainerStarted","Data":"040dbb5d3021cdb616ecb1742ba2783c1ce9790128210558555b82037c83b3cc"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.405633 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pbjx" event={"ID":"bc420b71-1e91-4239-b690-7901661380ef","Type":"ContainerStarted","Data":"fe314311588be5bd90aa0530007981980d4d5960d9f5cae94a3e3a53cf141fac"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.406106 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.406363 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:02.90635153 +0000 UTC m=+228.554047810 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.421223 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" event={"ID":"6bb28ce8-aad8-4db8-8492-319989f0059b","Type":"ContainerStarted","Data":"7f7ca2eadfeca3153fe59c5688605bc52e590ea81bd2004ecb20c2f7c5a45237"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.421290 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" event={"ID":"6bb28ce8-aad8-4db8-8492-319989f0059b","Type":"ContainerStarted","Data":"d0f437bc8387d707d43ae42fa4a3a876ff32a5a79ba3cdeea44c648a68a43f11"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.431248 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zkds6" podStartSLOduration=163.431232401 podStartE2EDuration="2m43.431232401s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.429698307 +0000 UTC m=+228.077394587" watchObservedRunningTime="2026-03-11 18:53:02.431232401 +0000 UTC m=+228.078928681"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.439909 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" event={"ID":"c67870e8-4d8e-4c5a-8f5b-6af70b655737","Type":"ContainerStarted","Data":"5717108f89b150f3a509b1952cd9d98e938064dfd8209be4363e97a765e8511b"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.442215 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" event={"ID":"9cf78d4e-beb5-4487-b728-e34230363308","Type":"ContainerStarted","Data":"cf2876a93081a221e7e347f1120945df57dfd2b5f6c3ae7c38de76998d1f237c"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.456455 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" event={"ID":"a286812e-873b-4844-a8ee-600ebdf1df1b","Type":"ContainerStarted","Data":"5d57be8f521a30361b5961687dd0117147d8cbbe0cdb93cb7d4e5f62b74bfedc"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.460667 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4glcd" event={"ID":"043b0177-5671-4fb6-9ffa-5ebe76d5e0f1","Type":"ContainerStarted","Data":"09995e33d73cd6701d2a4c55953d6f88a62e149457b96653bab13de1b3eb036d"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.462104 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" event={"ID":"0814e2b5-a228-48a8-8dfb-b6a8f51454d2","Type":"ContainerStarted","Data":"1b29fbf2b999f248facc33612f6c04a43c9bd35b5af692a2087d4eb0f9b2634f"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.463937 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" event={"ID":"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55","Type":"ContainerStarted","Data":"01f3b5473a024ed2acb11daaa4b1d80a64667105e5295b3f0ac2e5b4d6932e44"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.465125 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" event={"ID":"6e8e2825-2a37-4731-bc73-4e469bc34334","Type":"ContainerStarted","Data":"bbe847fa4ecb2b077b95ca3e23985e1f04995428ee05e1648d52f5e3aad1f6be"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.466774 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" event={"ID":"7a31fa9c-5154-4cd2-a877-05e22e173922","Type":"ContainerStarted","Data":"f460eb6f5161ee9ff6299ecc4cb8859e2828805741b5885d4d73eac30d8b2946"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.469520 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" event={"ID":"0c45514f-e876-486d-9e85-488f53adfdd1","Type":"ContainerStarted","Data":"369f9aca1838b287b0b2382636cb275fe96e70a96dfb3a0d72e92ae23fba9b6c"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.469546 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" event={"ID":"0c45514f-e876-486d-9e85-488f53adfdd1","Type":"ContainerStarted","Data":"37801e46c20667ac68156689212ead029315e74f422058e11f490c3b11341e58"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.474928 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ws6wr" podStartSLOduration=164.474906758 podStartE2EDuration="2m44.474906758s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.474851617 +0000 UTC m=+228.122547897" watchObservedRunningTime="2026-03-11 18:53:02.474906758 +0000 UTC m=+228.122603038"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.478232 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" event={"ID":"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83","Type":"ContainerStarted","Data":"da041c5ba19ee086032a67b90c60a3ec98da38c59e9b33d05749f35478c9ac09"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.478306 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" event={"ID":"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83","Type":"ContainerStarted","Data":"415ac27cb7bccd09f23596f2cbe40adb213e04466cc836639ffe4ccd33edcf9f"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.478528 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" podUID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" containerName="route-controller-manager" containerID="cri-o://da041c5ba19ee086032a67b90c60a3ec98da38c59e9b33d05749f35478c9ac09" gracePeriod=30
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.479225 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.488048 4842 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4hgzc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.488112 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" podUID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.491023 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" event={"ID":"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb","Type":"ContainerStarted","Data":"be3b15c1a70e0f27d068d2a55f8b18932280a1e83dbaf1e0c3a7bebf2bbeb13a"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.498705 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt" event={"ID":"17ad1809-8d6f-414d-9220-4cc69d21544e","Type":"ContainerStarted","Data":"66966c8e9b1db6eebe99fad1d5fdd697090b9cd11b17bb8bcd1be22d7d4d0a90"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.499815 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 11 18:53:02 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld
Mar 11 18:53:02 crc kubenswrapper[4842]: [+]process-running ok
Mar 11 18:53:02 crc kubenswrapper[4842]: healthz check failed
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.499885 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.501822 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" event={"ID":"6639342f-1d7c-4b9f-9836-2df2063e57b5","Type":"ContainerStarted","Data":"4744ff4021a7e51cb3bdcef23a4bb9059d64c0d1f0f696d2a6227b81c6d74194"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.507324 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.508697 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.008679673 +0000 UTC m=+228.656375943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.527212 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" event={"ID":"d1d3b484-b3a6-48e4-b2f9-bc2c1c4c2a5b","Type":"ContainerStarted","Data":"404c215706dca7ef985fb5b892e593821327eb333f02cc7fb0fb98a22a2e3159"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.529651 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" podStartSLOduration=163.529626361 podStartE2EDuration="2m43.529626361s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.519039629 +0000 UTC m=+228.166735909" watchObservedRunningTime="2026-03-11 18:53:02.529626361 +0000 UTC m=+228.177322641"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.544551 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" event={"ID":"30a9b79e-4043-4dc7-b625-53e0962a745b","Type":"ContainerStarted","Data":"73fe1dddee717ceb6ffe0c2bb9279ed21bc22a67a33ef29a7d6be8bcd7151c55"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.545005 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c6bvx" podStartSLOduration=164.54498783 podStartE2EDuration="2m44.54498783s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.54428862 +0000 UTC m=+228.191984910" watchObservedRunningTime="2026-03-11 18:53:02.54498783 +0000 UTC m=+228.192684110"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.553720 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" event={"ID":"84bc584f-a1e1-499e-acd5-f3a3aa3efe69","Type":"ContainerStarted","Data":"7c7ce9405c7e8dc6195a005c7c2544d9e00b2f5f1ffb42e4f09e62e72d75a695"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.561618 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5lrdp" podStartSLOduration=164.561601864 podStartE2EDuration="2m44.561601864s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.559496284 +0000 UTC m=+228.207192564" watchObservedRunningTime="2026-03-11 18:53:02.561601864 +0000 UTC m=+228.209298144"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.569157 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" event={"ID":"edfc5578-a5cb-4a87-bac7-5b82bcd564c1","Type":"ContainerStarted","Data":"b793b30ed23fa58e8f864b3e7df675a7548bb79164e1e7e6f8f1172875c85326"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.588525 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" event={"ID":"de8073d6-b58a-41f8-a20e-1de8878ee12a","Type":"ContainerStarted","Data":"802720e5338da775882ca960b409dffb1b0ce6cba526195a25f2c206cea3be4a"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.588629 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" event={"ID":"de8073d6-b58a-41f8-a20e-1de8878ee12a","Type":"ContainerStarted","Data":"912d6ea14bf3d1b25522d5e14009981a2071f8a0314cd97a53eb9728ffbe9794"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.588651 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" event={"ID":"de8073d6-b58a-41f8-a20e-1de8878ee12a","Type":"ContainerStarted","Data":"c97978d0e501cf09e0e8f0671fd550a4d3873a8598db9a277cdd6439a55bd8fe"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.608996 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.610213 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.110201172 +0000 UTC m=+228.757897452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.610627 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" event={"ID":"148bd39e-58ee-4a7f-aa9c-8435ab50d862","Type":"ContainerStarted","Data":"3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.612254 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.629621 4842 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2flrr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body=
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.629670 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" podUID="148bd39e-58ee-4a7f-aa9c-8435ab50d862" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.632662 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7x6n2" event={"ID":"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8","Type":"ContainerStarted","Data":"986dba7d764c03452e8abf36a1b6d926c6f5fc06712c379bbf6d5a3d1088c983"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.648186 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-s8v9m" podStartSLOduration=164.648163967 podStartE2EDuration="2m44.648163967s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.638566763 +0000 UTC m=+228.286263043" watchObservedRunningTime="2026-03-11 18:53:02.648163967 +0000 UTC m=+228.295860247"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.653138 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" event={"ID":"0103f6b8-b0b7-4dd4-bb7b-982db80c80ba","Type":"ContainerStarted","Data":"9fe2213bb82f9e2231828d8de9497a800cbd9a3376e95dfb9ac474f277fa80e0"}
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.654412 4842 patch_prober.go:28] interesting pod/console-operator-58897d9998-f9m6d container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.654438 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" podUID="20246915-b29b-42bf-871e-81fcf4b2da46" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.674900 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6vjq5"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.677579 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c9s2x"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.680624 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wg2s2"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.710480 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.711779 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.211729362 +0000 UTC m=+228.859425642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.743785 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" podStartSLOduration=164.743766977 podStartE2EDuration="2m44.743766977s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.698672959 +0000 UTC m=+228.346369259" watchObservedRunningTime="2026-03-11 18:53:02.743766977 +0000 UTC m=+228.391463257"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.744084 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5wmmr" podStartSLOduration=164.744078016 podStartE2EDuration="2m44.744078016s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:02.743013216 +0000 UTC m=+228.390709496" watchObservedRunningTime="2026-03-11 18:53:02.744078016 +0000 UTC m=+228.391774296"
Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.815919 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.817633 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.317621767 +0000 UTC m=+228.965318047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:02 crc kubenswrapper[4842]: I0311 18:53:02.917828 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:02 crc kubenswrapper[4842]: E0311 18:53:02.918304 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.418284772 +0000 UTC m=+229.065981052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.024013 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.024959 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.524947798 +0000 UTC m=+229.172644078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.126602 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.126898 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.62688243 +0000 UTC m=+229.274578710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.205452 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34df260e-28ff-4766-a6ea-5e8df0d34060" path="/var/lib/kubelet/pods/34df260e-28ff-4766-a6ea-5e8df0d34060/volumes" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.227692 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.228180 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.728164123 +0000 UTC m=+229.375860403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.328661 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.328868 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.828851849 +0000 UTC m=+229.476548129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.329030 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.329376 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.829370053 +0000 UTC m=+229.477066333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.431975 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.931961064 +0000 UTC m=+229.579657344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.431897 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.432545 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.432873 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:03.932865559 +0000 UTC m=+229.580561839 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.496996 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:03 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld Mar 11 18:53:03 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:03 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.497108 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.533619 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.533799 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.033771751 +0000 UTC m=+229.681468031 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.533937 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.534241 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.034228644 +0000 UTC m=+229.681924924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.627331 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45916: no serving certificate available for the kubelet" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.635014 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.635542 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.135523597 +0000 UTC m=+229.783219877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.693340 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" event={"ID":"0c45514f-e876-486d-9e85-488f53adfdd1","Type":"ContainerStarted","Data":"5d5c2bbbf9ab37f890cc0592eb5c0226c10f49e3c1effae8120f0b07cde1481c"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.709922 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7x6n2" event={"ID":"ceb0fc17-b6ca-4065-86bf-4cbcc78a91c8","Type":"ContainerStarted","Data":"47b47b289ce0ed9744052cdc0d7832a6a8bbde78d3a846890fcba47282d7546e"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.721368 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pbjx" event={"ID":"bc420b71-1e91-4239-b690-7901661380ef","Type":"ContainerStarted","Data":"4540fa1afbb223377d2f4ad295f930d69bd55e9220a0163d350436249c35fcf0"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.721409 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pbjx" event={"ID":"bc420b71-1e91-4239-b690-7901661380ef","Type":"ContainerStarted","Data":"f25d881cc3fd07200c0b87fcc971074114c26c4aa16a11cea55129ce22e7302f"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.721954 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4pbjx" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.724557 4842 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9zbwd" podStartSLOduration=165.724547419 podStartE2EDuration="2m45.724547419s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:03.722724497 +0000 UTC m=+229.370420777" watchObservedRunningTime="2026-03-11 18:53:03.724547419 +0000 UTC m=+229.372243699" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.728804 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bg2xr"] Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.729879 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.730502 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45932: no serving certificate available for the kubelet" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.731334 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.734082 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" event={"ID":"c67870e8-4d8e-4c5a-8f5b-6af70b655737","Type":"ContainerStarted","Data":"9e5a0e99fc6b612c928e4dd2c38b3922bb1b36d4b6e24c94e5594794552a5fae"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.736864 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: 
\"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.737308 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.237292094 +0000 UTC m=+229.884988374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.745993 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" event={"ID":"edfc5578-a5cb-4a87-bac7-5b82bcd564c1","Type":"ContainerStarted","Data":"47d72fdd07d132f9e3f5399addcccee4f413614753ad6e3979ba7a82705d6790"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.746032 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" event={"ID":"edfc5578-a5cb-4a87-bac7-5b82bcd564c1","Type":"ContainerStarted","Data":"27614802363cd504bcafd30e4e5c4ffdcc64affd2e1fed03f5d968b698e4bc2c"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.757589 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7x6n2" podStartSLOduration=8.757568183 podStartE2EDuration="8.757568183s" podCreationTimestamp="2026-03-11 18:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:03.757016217 +0000 UTC m=+229.404712497" watchObservedRunningTime="2026-03-11 18:53:03.757568183 +0000 UTC m=+229.405264463" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.765564 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg2xr"] Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.805648 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" event={"ID":"6e8e2825-2a37-4731-bc73-4e469bc34334","Type":"ContainerStarted","Data":"120d6c84408fa4e203d1a983cc8e45f9fcbaadc5eba377764a5c0715f2e1865c"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.806736 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.816470 4842 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-czxnq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.816513 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.837754 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.838180 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnsd\" (UniqueName: \"kubernetes.io/projected/7771ebaa-648a-46c4-986c-2cea25b5b7df-kube-api-access-xmnsd\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.838216 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-utilities\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.838254 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-catalog-content\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.838726 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt" event={"ID":"17ad1809-8d6f-414d-9220-4cc69d21544e","Type":"ContainerStarted","Data":"461f1384dc93f9b068d1474d1af960b155293e414da9be2967f097632903de41"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.838759 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt" 
event={"ID":"17ad1809-8d6f-414d-9220-4cc69d21544e","Type":"ContainerStarted","Data":"93a30374ded6f664c8c507fed6adc8421df16eb46d1a1bc286c91b462f5e280c"} Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.839665 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.339648537 +0000 UTC m=+229.987344817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.868395 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" podStartSLOduration=164.868374578 podStartE2EDuration="2m44.868374578s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:03.867782381 +0000 UTC m=+229.515478661" watchObservedRunningTime="2026-03-11 18:53:03.868374578 +0000 UTC m=+229.516070858" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.869227 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4pbjx" podStartSLOduration=8.869219982 podStartE2EDuration="8.869219982s" podCreationTimestamp="2026-03-11 18:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 18:53:03.816146806 +0000 UTC m=+229.463843086" watchObservedRunningTime="2026-03-11 18:53:03.869219982 +0000 UTC m=+229.516916262" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.869927 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" event={"ID":"0814e2b5-a228-48a8-8dfb-b6a8f51454d2","Type":"ContainerStarted","Data":"dfc5ba47099721468d53365820f27f31b34b5cf650522d8292e8a74e13b0614c"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.869975 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" event={"ID":"0814e2b5-a228-48a8-8dfb-b6a8f51454d2","Type":"ContainerStarted","Data":"5db33eef060a4cc0033314be0e34befed9aa3580923928cda00acfdfd7257d1d"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.887425 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45940: no serving certificate available for the kubelet" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.918582 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" event={"ID":"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb","Type":"ContainerStarted","Data":"c50ca3d842c66d701753afda09287ccb6b3978fa5243e9feafc8ccb895a39275"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.919507 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.928057 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vb82w"] Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.929080 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.938718 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.939401 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.940959 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-utilities\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.941008 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnsd\" (UniqueName: \"kubernetes.io/projected/7771ebaa-648a-46c4-986c-2cea25b5b7df-kube-api-access-xmnsd\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.941064 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-catalog-content\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.941160 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.941551 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ktfqw" podStartSLOduration=165.941538437 podStartE2EDuration="2m45.941538437s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:03.939227361 +0000 UTC m=+229.586923641" watchObservedRunningTime="2026-03-11 18:53:03.941538437 +0000 UTC m=+229.589234717" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.941790 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-catalog-content\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.942677 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-utilities\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:03 crc kubenswrapper[4842]: E0311 18:53:03.942994 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.442979348 +0000 UTC m=+230.090675708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.943605 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" event={"ID":"84bc584f-a1e1-499e-acd5-f3a3aa3efe69","Type":"ContainerStarted","Data":"1c383bf4154055c342e4ecb64032b52031e89429134fdd047dd404ad086f2150"} Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.956423 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vb82w"] Mar 11 18:53:03 crc kubenswrapper[4842]: I0311 18:53:03.985657 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" event={"ID":"9cf78d4e-beb5-4487-b728-e34230363308","Type":"ContainerStarted","Data":"a981963a1ff1f24c13535d68d05d197521a7cd064b8b68280ad6e0bffa7096cd"} Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.002568 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnsd\" (UniqueName: \"kubernetes.io/projected/7771ebaa-648a-46c4-986c-2cea25b5b7df-kube-api-access-xmnsd\") pod \"certified-operators-bg2xr\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.004696 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zfcwb" podStartSLOduration=165.004678271 podStartE2EDuration="2m45.004678271s" 
podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.000522872 +0000 UTC m=+229.648219152" watchObservedRunningTime="2026-03-11 18:53:04.004678271 +0000 UTC m=+229.652374551" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.009406 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45948: no serving certificate available for the kubelet" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.032754 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" event={"ID":"8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55","Type":"ContainerStarted","Data":"16099b287bfa95c57a6b56aee878215f8c72cc0f2af4255e9d6bb6919b8031ad"} Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.051090 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.051340 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-catalog-content\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.051408 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-utilities\") pod \"community-operators-vb82w\" (UID: 
\"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.051570 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzkht\" (UniqueName: \"kubernetes.io/projected/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-kube-api-access-hzkht\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.052558 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.552526167 +0000 UTC m=+230.200222447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.055012 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" event={"ID":"7a31fa9c-5154-4cd2-a877-05e22e173922","Type":"ContainerStarted","Data":"3afa0c7b5cd8b5ca9dff97e7525b4904aeb8cab156a08ceac7878a1a10582688"} Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.055081 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.087706 4842 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" event={"ID":"6bb28ce8-aad8-4db8-8492-319989f0059b","Type":"ContainerStarted","Data":"4c517b714334008ae64982688f7a036e33e3a649031c35dec530b1add0675fd3"} Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.088061 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.109283 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntf4r" podStartSLOduration=165.109254488 podStartE2EDuration="2m45.109254488s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.103778961 +0000 UTC m=+229.751475231" watchObservedRunningTime="2026-03-11 18:53:04.109254488 +0000 UTC m=+229.756950768" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.116151 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.148843 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" podStartSLOduration=5.148826178 podStartE2EDuration="5.148826178s" podCreationTimestamp="2026-03-11 18:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.138356069 +0000 UTC m=+229.786052349" watchObservedRunningTime="2026-03-11 18:53:04.148826178 +0000 UTC m=+229.796522458" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.149324 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7bslz"] Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.150211 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.153033 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzkht\" (UniqueName: \"kubernetes.io/projected/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-kube-api-access-hzkht\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.153210 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-catalog-content\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.153288 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.153355 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-utilities\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.156869 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-catalog-content\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.157916 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.657903087 +0000 UTC m=+230.305599367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.158932 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-utilities\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.169557 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bslz"] Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.196735 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45952: no serving certificate available for the kubelet" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.205296 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kdbnc" podStartSLOduration=166.20525948 podStartE2EDuration="2m46.20525948s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.196976633 +0000 UTC m=+229.844672913" watchObservedRunningTime="2026-03-11 18:53:04.20525948 +0000 UTC m=+229.852955760" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.213690 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzkht\" (UniqueName: 
\"kubernetes.io/projected/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-kube-api-access-hzkht\") pod \"community-operators-vb82w\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.301886 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.302155 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-utilities\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.302282 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gmt\" (UniqueName: \"kubernetes.io/projected/60214716-6377-46b4-9c9e-adc90ffca659-kube-api-access-d5gmt\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.302315 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-catalog-content\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.302447 4842 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:04.802423605 +0000 UTC m=+230.450119885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.317566 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.322640 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wm6kt" podStartSLOduration=165.322600021 podStartE2EDuration="2m45.322600021s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.318876535 +0000 UTC m=+229.966572815" watchObservedRunningTime="2026-03-11 18:53:04.322600021 +0000 UTC m=+229.970296301" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.327107 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.375138 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d64lq"] Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.383455 4842 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.403918 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gmt\" (UniqueName: \"kubernetes.io/projected/60214716-6377-46b4-9c9e-adc90ffca659-kube-api-access-d5gmt\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.403954 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-catalog-content\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.404001 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-utilities\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.404020 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.404562 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-11 18:53:04.904550252 +0000 UTC m=+230.552246522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.405202 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-catalog-content\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.413804 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-utilities\") pod \"certified-operators-7bslz\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.429480 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d64lq"] Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.442876 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45960: no serving certificate available for the kubelet" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.462468 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gmt\" (UniqueName: \"kubernetes.io/projected/60214716-6377-46b4-9c9e-adc90ffca659-kube-api-access-d5gmt\") pod \"certified-operators-7bslz\" (UID: 
\"60214716-6377-46b4-9c9e-adc90ffca659\") " pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.507479 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" podStartSLOduration=165.507462251 podStartE2EDuration="2m45.507462251s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.448313892 +0000 UTC m=+230.096010172" watchObservedRunningTime="2026-03-11 18:53:04.507462251 +0000 UTC m=+230.155158531" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.529297 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.529546 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gld7g\" (UniqueName: \"kubernetes.io/projected/a0ef18e4-d9a7-4122-89ed-b556ed419954-kube-api-access-gld7g\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.529579 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-utilities\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.529614 4842 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-catalog-content\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.529783 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.029760138 +0000 UTC m=+230.677456418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.533635 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:04 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld Mar 11 18:53:04 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:04 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.533692 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.544427 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.567788 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" podStartSLOduration=165.567767304 podStartE2EDuration="2m45.567767304s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.509999834 +0000 UTC m=+230.157696104" watchObservedRunningTime="2026-03-11 18:53:04.567767304 +0000 UTC m=+230.215463594" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.632032 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.632105 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gld7g\" (UniqueName: \"kubernetes.io/projected/a0ef18e4-d9a7-4122-89ed-b556ed419954-kube-api-access-gld7g\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.632124 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-utilities\") pod \"community-operators-d64lq\" (UID: 
\"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.632169 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-catalog-content\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.632585 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-catalog-content\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.632815 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.132802491 +0000 UTC m=+230.780498771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.633345 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-utilities\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.715254 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9hng8" podStartSLOduration=165.715235976 podStartE2EDuration="2m45.715235976s" podCreationTimestamp="2026-03-11 18:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:04.567730743 +0000 UTC m=+230.215427023" watchObservedRunningTime="2026-03-11 18:53:04.715235976 +0000 UTC m=+230.362932256" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.726441 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gld7g\" (UniqueName: \"kubernetes.io/projected/a0ef18e4-d9a7-4122-89ed-b556ed419954-kube-api-access-gld7g\") pod \"community-operators-d64lq\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.736782 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.737171 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.237154952 +0000 UTC m=+230.884851232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.755601 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45976: no serving certificate available for the kubelet" Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.838539 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.838890 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-11 18:53:05.338875037 +0000 UTC m=+230.986571317 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:04 crc kubenswrapper[4842]: I0311 18:53:04.942110 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:04 crc kubenswrapper[4842]: E0311 18:53:04.942398 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.442381593 +0000 UTC m=+231.090077863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.023976 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.045149 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.045510 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.545498029 +0000 UTC m=+231.193194309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.046419 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg2xr"] Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.068386 4842 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kfsrt container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.068443 4842 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" podUID="7a31fa9c-5154-4cd2-a877-05e22e173922" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.143058 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg2xr" event={"ID":"7771ebaa-648a-46c4-986c-2cea25b5b7df","Type":"ContainerStarted","Data":"bb665212a33380ea52581d46f3c02581cf6a50fa3e286ff73dcd41cebc8d7806"} Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.145790 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.146096 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.646081772 +0000 UTC m=+231.293778052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.148606 4842 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-czxnq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.148640 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.158754 4842 ???:1] "http: TLS handshake error from 192.168.126.11:45984: no serving certificate available for the kubelet" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.246763 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.252694 4842 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.752676826 +0000 UTC m=+231.400373106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.319168 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfsrt" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.348541 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.349010 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.848994817 +0000 UTC m=+231.496691087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.369126 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bslz"] Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.450107 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.450977 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:05.950749063 +0000 UTC m=+231.598445343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.470501 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vb82w"] Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.505500 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:05 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld Mar 11 18:53:05 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:05 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.505541 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.544157 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d64lq"] Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.551172 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.551373 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.051344677 +0000 UTC m=+231.699040957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.551779 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.552095 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.052084368 +0000 UTC m=+231.699780698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.653221 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.653450 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.153397531 +0000 UTC m=+231.801093811 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.653576 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.653946 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.153931687 +0000 UTC m=+231.801627977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: W0311 18:53:05.721816 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ef18e4_d9a7_4122_89ed_b556ed419954.slice/crio-7bcec6492fefc7f7ebe3fd14a5cc50bd50d6229b67a881406a88142ea6a64115 WatchSource:0}: Error finding container 7bcec6492fefc7f7ebe3fd14a5cc50bd50d6229b67a881406a88142ea6a64115: Status 404 returned error can't find the container with id 7bcec6492fefc7f7ebe3fd14a5cc50bd50d6229b67a881406a88142ea6a64115 Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.754344 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.754901 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.25487855 +0000 UTC m=+231.902574830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.846424 4842 ???:1] "http: TLS handshake error from 192.168.126.11:46000: no serving certificate available for the kubelet" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.856753 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.857349 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.357322716 +0000 UTC m=+232.005018986 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.919014 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6rrvh"] Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.925485 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rrvh"] Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.925633 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.928136 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.958188 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.958456 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.458432804 +0000 UTC m=+232.106129074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:05 crc kubenswrapper[4842]: I0311 18:53:05.961039 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:05 crc kubenswrapper[4842]: E0311 18:53:05.961825 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.46180044 +0000 UTC m=+232.109496720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.059163 4842 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.064924 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.065070 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.565041039 +0000 UTC m=+232.212737319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.065704 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-utilities\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.065957 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.066041 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jr8f\" (UniqueName: \"kubernetes.io/projected/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-kube-api-access-9jr8f\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.066224 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-catalog-content\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.066255 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.566247653 +0000 UTC m=+232.213943933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.153345 4842 generic.go:334] "Generic (PLEG): container finished" podID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerID="c4da4933ba4b771058697b831475abbcbd9dbc33c9eb105659ae7a216b3cb42d" exitCode=0 Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.153461 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg2xr" event={"ID":"7771ebaa-648a-46c4-986c-2cea25b5b7df","Type":"ContainerDied","Data":"c4da4933ba4b771058697b831475abbcbd9dbc33c9eb105659ae7a216b3cb42d"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.157341 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" event={"ID":"84bc584f-a1e1-499e-acd5-f3a3aa3efe69","Type":"ContainerStarted","Data":"1531ff7c0c7d7d7380008459115c21e63a88b229d33090b9047c89fc1db011b4"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.166774 
4842 generic.go:334] "Generic (PLEG): container finished" podID="a2bd84be-1f01-47e4-a35e-4ed993d4be9b" containerID="f0c71b5e37069aa7fd05e8a991e02e22b32f744273fdbe6f859af7480faf366a" exitCode=0 Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.166956 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.166961 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" event={"ID":"a2bd84be-1f01-47e4-a35e-4ed993d4be9b","Type":"ContainerDied","Data":"f0c71b5e37069aa7fd05e8a991e02e22b32f744273fdbe6f859af7480faf366a"} Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.167128 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.667100374 +0000 UTC m=+232.314796654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.167209 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-catalog-content\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.167345 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-utilities\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.167480 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.167517 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jr8f\" (UniqueName: \"kubernetes.io/projected/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-kube-api-access-9jr8f\") pod \"redhat-marketplace-6rrvh\" (UID: 
\"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.168109 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-utilities\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.168180 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-catalog-content\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.168220 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.668187105 +0000 UTC m=+232.315883585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.169329 4842 generic.go:334] "Generic (PLEG): container finished" podID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerID="204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44" exitCode=0 Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.169575 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d64lq" event={"ID":"a0ef18e4-d9a7-4122-89ed-b556ed419954","Type":"ContainerDied","Data":"204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.169698 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d64lq" event={"ID":"a0ef18e4-d9a7-4122-89ed-b556ed419954","Type":"ContainerStarted","Data":"7bcec6492fefc7f7ebe3fd14a5cc50bd50d6229b67a881406a88142ea6a64115"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.175263 4842 generic.go:334] "Generic (PLEG): container finished" podID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerID="1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4" exitCode=0 Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.175472 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb82w" event={"ID":"ff824009-ab02-4a23-9c8a-76bc3d6a5f04","Type":"ContainerDied","Data":"1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.175544 
4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb82w" event={"ID":"ff824009-ab02-4a23-9c8a-76bc3d6a5f04","Type":"ContainerStarted","Data":"9e4ff073384f0d36d2fd8f4207a4564fc517f7d52ab24a02318d2152d777079d"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.176943 4842 generic.go:334] "Generic (PLEG): container finished" podID="60214716-6377-46b4-9c9e-adc90ffca659" containerID="386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431" exitCode=0 Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.177151 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bslz" event={"ID":"60214716-6377-46b4-9c9e-adc90ffca659","Type":"ContainerDied","Data":"386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.177175 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bslz" event={"ID":"60214716-6377-46b4-9c9e-adc90ffca659","Type":"ContainerStarted","Data":"7ff95503cf1d4dc05f4e88b1c37fb010aea1c91f62e5a86745dbae366afca348"} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.189580 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.197800 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jr8f\" (UniqueName: \"kubernetes.io/projected/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-kube-api-access-9jr8f\") pod \"redhat-marketplace-6rrvh\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.261617 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.268239 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.268694 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.768650114 +0000 UTC m=+232.416346394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.320978 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wvmzx"] Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.322835 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.331239 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvmzx"] Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.369853 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.371572 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.871558754 +0000 UTC m=+232.519255034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.378383 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.384934 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-h2kpt" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.476430 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.476936 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.976689847 +0000 UTC m=+232.624386127 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.477149 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-utilities\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.477219 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: E0311 18:53:06.478713 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 18:53:06.978697674 +0000 UTC m=+232.626394014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-twwzj" (UID: "1882c06f-22f3-4346-8435-418f034f7d09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.482051 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlggr\" (UniqueName: \"kubernetes.io/projected/22097394-ae90-446c-9114-14a1f1d184bb-kube-api-access-rlggr\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.483328 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-catalog-content\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.499948 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:06 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld Mar 11 18:53:06 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:06 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.499995 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" 
podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.515490 4842 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-11T18:53:06.059522121Z","Handler":null,"Name":""} Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.524341 4842 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.524391 4842 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.596597 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.597007 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-utilities\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.597121 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlggr\" (UniqueName: \"kubernetes.io/projected/22097394-ae90-446c-9114-14a1f1d184bb-kube-api-access-rlggr\") pod 
\"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.597145 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-catalog-content\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.598177 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-utilities\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.599346 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-catalog-content\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.604499 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.619784 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rrvh"] Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.626546 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlggr\" (UniqueName: \"kubernetes.io/projected/22097394-ae90-446c-9114-14a1f1d184bb-kube-api-access-rlggr\") pod \"redhat-marketplace-wvmzx\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") " pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.642053 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.699105 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.703058 4842 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.703108 4842 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.754626 4842 patch_prober.go:28] interesting pod/downloads-7954f5f757-mhz6l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.754646 4842 patch_prober.go:28] interesting pod/downloads-7954f5f757-mhz6l container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.754691 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mhz6l" podUID="ca012f19-1dcd-41c8-8e17-bb98db200573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.754699 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mhz6l" podUID="ca012f19-1dcd-41c8-8e17-bb98db200573" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: 
connection refused" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.756760 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-twwzj\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") " pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.917137 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvszz"] Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.919046 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.921702 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.922703 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvmzx"] Mar 11 18:53:06 crc kubenswrapper[4842]: W0311 18:53:06.923218 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22097394_ae90_446c_9114_14a1f1d184bb.slice/crio-93d01703a7e129f0d05259cad617199bf872d703db62fef499d93c075bf2dc14 WatchSource:0}: Error finding container 93d01703a7e129f0d05259cad617199bf872d703db62fef499d93c075bf2dc14: Status 404 returned error can't find the container with id 93d01703a7e129f0d05259cad617199bf872d703db62fef499d93c075bf2dc14 Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.928306 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvszz"] Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.950183 4842 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:06 crc kubenswrapper[4842]: I0311 18:53:06.973148 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.005716 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-utilities\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.005780 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-catalog-content\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.006001 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcthv\" (UniqueName: \"kubernetes.io/projected/7857f1af-d426-446f-a295-05423f407554-kube-api-access-dcthv\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.107297 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-utilities\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: 
I0311 18:53:07.107341 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-catalog-content\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.107432 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcthv\" (UniqueName: \"kubernetes.io/projected/7857f1af-d426-446f-a295-05423f407554-kube-api-access-dcthv\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.107850 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-utilities\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.107927 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-catalog-content\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.128892 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcthv\" (UniqueName: \"kubernetes.io/projected/7857f1af-d426-446f-a295-05423f407554-kube-api-access-dcthv\") pod \"redhat-operators-jvszz\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.157372 4842 ???:1] "http: TLS handshake 
error from 192.168.126.11:46008: no serving certificate available for the kubelet" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.162993 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-twwzj"] Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.189958 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" event={"ID":"84bc584f-a1e1-499e-acd5-f3a3aa3efe69","Type":"ContainerStarted","Data":"8cd1213c0d9ce7571816c08ce84a9f5797178131e0c36c824f3eb9dcb523e9fe"} Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.190531 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" event={"ID":"84bc584f-a1e1-499e-acd5-f3a3aa3efe69","Type":"ContainerStarted","Data":"d72e09513dcdb20c9e7415d57a1a993c474a53fe0145a7ef2c261075737a4c7e"} Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.199064 4842 generic.go:334] "Generic (PLEG): container finished" podID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerID="2b6c45befea2c91145e4d4a8b4637ccc830b1c282938fdcec488e831b3e07242" exitCode=0 Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.199132 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rrvh" event={"ID":"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7","Type":"ContainerDied","Data":"2b6c45befea2c91145e4d4a8b4637ccc830b1c282938fdcec488e831b3e07242"} Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.199212 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rrvh" event={"ID":"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7","Type":"ContainerStarted","Data":"1d092e09cc5d4c02761f34cf1c4bfba02889d433eb8216b34d4d70671b527366"} Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.200577 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" 
event={"ID":"1882c06f-22f3-4346-8435-418f034f7d09","Type":"ContainerStarted","Data":"ffc89008597dee3227bf49cd34a9f5ea37e250ae3d21ef0eb0c4643e917eba40"} Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.204418 4842 generic.go:334] "Generic (PLEG): container finished" podID="22097394-ae90-446c-9114-14a1f1d184bb" containerID="54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53" exitCode=0 Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.204569 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvmzx" event={"ID":"22097394-ae90-446c-9114-14a1f1d184bb","Type":"ContainerDied","Data":"54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53"} Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.204636 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvmzx" event={"ID":"22097394-ae90-446c-9114-14a1f1d184bb","Type":"ContainerStarted","Data":"93d01703a7e129f0d05259cad617199bf872d703db62fef499d93c075bf2dc14"} Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.217686 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vlcsj" podStartSLOduration=12.217661269 podStartE2EDuration="12.217661269s" podCreationTimestamp="2026-03-11 18:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:07.212742909 +0000 UTC m=+232.860439189" watchObservedRunningTime="2026-03-11 18:53:07.217661269 +0000 UTC m=+232.865357569" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.237338 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.309765 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hr6hd"] Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.311120 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.320106 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hr6hd"] Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.416395 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-utilities\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.416462 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkkm\" (UniqueName: \"kubernetes.io/projected/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-kube-api-access-ckkkm\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.416595 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-catalog-content\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.468994 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.499517 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:07 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld Mar 11 18:53:07 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:07 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.499578 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.515521 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvszz"] Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.518159 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-config-volume\") pod \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.518227 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mxgm\" (UniqueName: \"kubernetes.io/projected/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-kube-api-access-9mxgm\") pod \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.518250 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-secret-volume\") pod \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\" (UID: \"a2bd84be-1f01-47e4-a35e-4ed993d4be9b\") " Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.518688 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-catalog-content\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.518731 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-utilities\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.518755 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkkm\" (UniqueName: \"kubernetes.io/projected/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-kube-api-access-ckkkm\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.519284 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-config-volume" (OuterVolumeSpecName: "config-volume") pod "a2bd84be-1f01-47e4-a35e-4ed993d4be9b" (UID: "a2bd84be-1f01-47e4-a35e-4ed993d4be9b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.520136 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-utilities\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.520527 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-catalog-content\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.524421 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-kube-api-access-9mxgm" (OuterVolumeSpecName: "kube-api-access-9mxgm") pod "a2bd84be-1f01-47e4-a35e-4ed993d4be9b" (UID: "a2bd84be-1f01-47e4-a35e-4ed993d4be9b"). InnerVolumeSpecName "kube-api-access-9mxgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.524574 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a2bd84be-1f01-47e4-a35e-4ed993d4be9b" (UID: "a2bd84be-1f01-47e4-a35e-4ed993d4be9b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.534514 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkkm\" (UniqueName: \"kubernetes.io/projected/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-kube-api-access-ckkkm\") pod \"redhat-operators-hr6hd\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.619977 4842 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.620010 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mxgm\" (UniqueName: \"kubernetes.io/projected/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-kube-api-access-9mxgm\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.620020 4842 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a2bd84be-1f01-47e4-a35e-4ed993d4be9b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.632795 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:53:07 crc kubenswrapper[4842]: I0311 18:53:07.856630 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hr6hd"] Mar 11 18:53:07 crc kubenswrapper[4842]: W0311 18:53:07.881464 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf46a8d85_7384_4cdc_a19d_92a477bcc7d6.slice/crio-dce966aa1a6948d41d28e0906ec625492e4b552c28a098723ddeb90f85e762be WatchSource:0}: Error finding container dce966aa1a6948d41d28e0906ec625492e4b552c28a098723ddeb90f85e762be: Status 404 returned error can't find the container with id dce966aa1a6948d41d28e0906ec625492e4b552c28a098723ddeb90f85e762be Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.027225 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 18:53:08 crc kubenswrapper[4842]: E0311 18:53:08.027517 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2bd84be-1f01-47e4-a35e-4ed993d4be9b" containerName="collect-profiles" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.027591 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2bd84be-1f01-47e4-a35e-4ed993d4be9b" containerName="collect-profiles" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.027735 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2bd84be-1f01-47e4-a35e-4ed993d4be9b" containerName="collect-profiles" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.028121 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.033847 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.033931 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.040824 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.125762 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e46380b-794d-4d39-9f89-dd85644dda2b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.125859 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e46380b-794d-4d39-9f89-dd85644dda2b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.201826 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.201873 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.204988 4842 patch_prober.go:28] interesting pod/console-f9d7485db-v5z72 container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.205042 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v5z72" podUID="b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.218919 4842 generic.go:334] "Generic (PLEG): container finished" podID="7857f1af-d426-446f-a295-05423f407554" containerID="05505b8293d6a7b97c802c2d9e9f9d99e8ef9dd8dbc2cd67efe418a6680f8dc4" exitCode=0 Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.219002 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvszz" event={"ID":"7857f1af-d426-446f-a295-05423f407554","Type":"ContainerDied","Data":"05505b8293d6a7b97c802c2d9e9f9d99e8ef9dd8dbc2cd67efe418a6680f8dc4"} Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.219067 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvszz" event={"ID":"7857f1af-d426-446f-a295-05423f407554","Type":"ContainerStarted","Data":"2212ab128c3a804ebbf1814b2946bb9b4dc1dfe26608180e50d7ca8a8cff2a06"} Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.224307 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hr6hd" event={"ID":"f46a8d85-7384-4cdc-a19d-92a477bcc7d6","Type":"ContainerStarted","Data":"dce966aa1a6948d41d28e0906ec625492e4b552c28a098723ddeb90f85e762be"} Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.227871 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8e46380b-794d-4d39-9f89-dd85644dda2b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.228113 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e46380b-794d-4d39-9f89-dd85644dda2b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.228239 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e46380b-794d-4d39-9f89-dd85644dda2b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.231596 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" event={"ID":"a2bd84be-1f01-47e4-a35e-4ed993d4be9b","Type":"ContainerDied","Data":"905c961ad3c43f06971f930404ad9ee6559939aaa0709a7f55dd23e5ac590dcb"} Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.231635 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="905c961ad3c43f06971f930404ad9ee6559939aaa0709a7f55dd23e5ac590dcb" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.231729 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.243702 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" event={"ID":"1882c06f-22f3-4346-8435-418f034f7d09","Type":"ContainerStarted","Data":"5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3"} Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.252641 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e46380b-794d-4d39-9f89-dd85644dda2b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.264843 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" podStartSLOduration=170.264819668 podStartE2EDuration="2m50.264819668s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:08.263702196 +0000 UTC m=+233.911398476" watchObservedRunningTime="2026-03-11 18:53:08.264819668 +0000 UTC m=+233.912515948" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.351414 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.481030 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-f9m6d" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.492109 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.509827 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:08 crc kubenswrapper[4842]: [-]has-synced failed: reason withheld Mar 11 18:53:08 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:08 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.510001 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.510174 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.574477 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 18:53:08 crc kubenswrapper[4842]: W0311 18:53:08.585098 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8e46380b_794d_4d39_9f89_dd85644dda2b.slice/crio-cb7412b226f957f0f0053eea5766190a2c6f46bf72d367d465d82b85cd53f533 WatchSource:0}: 
Error finding container cb7412b226f957f0f0053eea5766190a2c6f46bf72d367d465d82b85cd53f533: Status 404 returned error can't find the container with id cb7412b226f957f0f0053eea5766190a2c6f46bf72d367d465d82b85cd53f533 Mar 11 18:53:08 crc kubenswrapper[4842]: I0311 18:53:08.830289 4842 ???:1] "http: TLS handshake error from 192.168.126.11:46014: no serving certificate available for the kubelet" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.249646 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e46380b-794d-4d39-9f89-dd85644dda2b","Type":"ContainerStarted","Data":"654cd825ea972fb20d41cc78ac6cf04a7bed7f4e13deb2540fb7b25542b62a2d"} Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.249700 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e46380b-794d-4d39-9f89-dd85644dda2b","Type":"ContainerStarted","Data":"cb7412b226f957f0f0053eea5766190a2c6f46bf72d367d465d82b85cd53f533"} Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.252583 4842 generic.go:334] "Generic (PLEG): container finished" podID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerID="43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133" exitCode=0 Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.253931 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hr6hd" event={"ID":"f46a8d85-7384-4cdc-a19d-92a477bcc7d6","Type":"ContainerDied","Data":"43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133"} Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.253996 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.495905 4842 patch_prober.go:28] interesting pod/router-default-5444994796-xgzxx container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 18:53:09 crc kubenswrapper[4842]: [+]has-synced ok Mar 11 18:53:09 crc kubenswrapper[4842]: [+]process-running ok Mar 11 18:53:09 crc kubenswrapper[4842]: healthz check failed Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.496231 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xgzxx" podUID="bab77e0c-a6e8-4e8b-a036-695cda94d7db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.529120 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.529763 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.532077 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.532304 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.547210 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.662724 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c2935ea-3767-4054-8842-4d32fd301ba4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:09 crc 
kubenswrapper[4842]: I0311 18:53:09.662823 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c2935ea-3767-4054-8842-4d32fd301ba4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.742894 4842 ???:1] "http: TLS handshake error from 192.168.126.11:46030: no serving certificate available for the kubelet" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.763780 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c2935ea-3767-4054-8842-4d32fd301ba4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.763852 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c2935ea-3767-4054-8842-4d32fd301ba4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.763887 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c2935ea-3767-4054-8842-4d32fd301ba4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.786427 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c2935ea-3767-4054-8842-4d32fd301ba4-kube-api-access\") pod \"revision-pruner-8-crc\" 
(UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:09 crc kubenswrapper[4842]: I0311 18:53:09.899581 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:10 crc kubenswrapper[4842]: I0311 18:53:10.186543 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 18:53:10 crc kubenswrapper[4842]: I0311 18:53:10.270918 4842 generic.go:334] "Generic (PLEG): container finished" podID="8e46380b-794d-4d39-9f89-dd85644dda2b" containerID="654cd825ea972fb20d41cc78ac6cf04a7bed7f4e13deb2540fb7b25542b62a2d" exitCode=0 Mar 11 18:53:10 crc kubenswrapper[4842]: I0311 18:53:10.271903 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e46380b-794d-4d39-9f89-dd85644dda2b","Type":"ContainerDied","Data":"654cd825ea972fb20d41cc78ac6cf04a7bed7f4e13deb2540fb7b25542b62a2d"} Mar 11 18:53:10 crc kubenswrapper[4842]: I0311 18:53:10.494933 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:53:10 crc kubenswrapper[4842]: I0311 18:53:10.497084 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xgzxx" Mar 11 18:53:14 crc kubenswrapper[4842]: I0311 18:53:14.014299 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4pbjx" Mar 11 18:53:16 crc kubenswrapper[4842]: I0311 18:53:16.760838 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mhz6l" Mar 11 18:53:18 crc kubenswrapper[4842]: I0311 18:53:18.128169 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dd97fc889-68h7m"] Mar 11 
18:53:18 crc kubenswrapper[4842]: I0311 18:53:18.128876 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" podUID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" containerName="controller-manager" containerID="cri-o://c50ca3d842c66d701753afda09287ccb6b3978fa5243e9feafc8ccb895a39275" gracePeriod=30 Mar 11 18:53:18 crc kubenswrapper[4842]: I0311 18:53:18.207972 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:53:18 crc kubenswrapper[4842]: I0311 18:53:18.212013 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-v5z72" Mar 11 18:53:19 crc kubenswrapper[4842]: I0311 18:53:19.339865 4842 generic.go:334] "Generic (PLEG): container finished" podID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" containerID="c50ca3d842c66d701753afda09287ccb6b3978fa5243e9feafc8ccb895a39275" exitCode=0 Mar 11 18:53:19 crc kubenswrapper[4842]: I0311 18:53:19.339955 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" event={"ID":"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb","Type":"ContainerDied","Data":"c50ca3d842c66d701753afda09287ccb6b3978fa5243e9feafc8ccb895a39275"} Mar 11 18:53:20 crc kubenswrapper[4842]: I0311 18:53:20.003835 4842 ???:1] "http: TLS handshake error from 192.168.126.11:52546: no serving certificate available for the kubelet" Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.557051 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.669698 4842 patch_prober.go:28] interesting pod/controller-manager-7dd97fc889-68h7m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.670256 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" podUID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.730706 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e46380b-794d-4d39-9f89-dd85644dda2b-kubelet-dir\") pod \"8e46380b-794d-4d39-9f89-dd85644dda2b\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.730872 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e46380b-794d-4d39-9f89-dd85644dda2b-kube-api-access\") pod \"8e46380b-794d-4d39-9f89-dd85644dda2b\" (UID: \"8e46380b-794d-4d39-9f89-dd85644dda2b\") " Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.730874 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e46380b-794d-4d39-9f89-dd85644dda2b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8e46380b-794d-4d39-9f89-dd85644dda2b" (UID: "8e46380b-794d-4d39-9f89-dd85644dda2b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.731195 4842 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e46380b-794d-4d39-9f89-dd85644dda2b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.755258 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e46380b-794d-4d39-9f89-dd85644dda2b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e46380b-794d-4d39-9f89-dd85644dda2b" (UID: "8e46380b-794d-4d39-9f89-dd85644dda2b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:21 crc kubenswrapper[4842]: I0311 18:53:21.832975 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e46380b-794d-4d39-9f89-dd85644dda2b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.136690 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.140030 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.152904 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7a00900-ec76-49e4-9485-131830a0611e-metrics-certs\") pod \"network-metrics-daemon-8vd7m\" (UID: \"a7a00900-ec76-49e4-9485-131830a0611e\") " pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:53:22 crc 
kubenswrapper[4842]: I0311 18:53:22.308252 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.316038 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8vd7m" Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.357301 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0c2935ea-3767-4054-8842-4d32fd301ba4","Type":"ContainerStarted","Data":"5f54d05073fb192ae47f293a6a2b100fd43d2f566dc3b1bde3e3bf14941455f6"} Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.358759 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8e46380b-794d-4d39-9f89-dd85644dda2b","Type":"ContainerDied","Data":"cb7412b226f957f0f0053eea5766190a2c6f46bf72d367d465d82b85cd53f533"} Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.358817 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb7412b226f957f0f0053eea5766190a2c6f46bf72d367d465d82b85cd53f533" Mar 11 18:53:22 crc kubenswrapper[4842]: I0311 18:53:22.358834 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 18:53:26 crc kubenswrapper[4842]: I0311 18:53:26.959850 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.472440 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.472871 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 18:53:31 crc kubenswrapper[4842]: E0311 18:53:31.670722 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 11 18:53:31 crc kubenswrapper[4842]: E0311 18:53:31.670893 4842 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 18:53:31 crc kubenswrapper[4842]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 11 18:53:31 crc kubenswrapper[4842]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kcfsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29554252-4qdmf_openshift-infra(30a9b79e-4043-4dc7-b625-53e0962a745b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 11 18:53:31 crc kubenswrapper[4842]: > logger="UnhandledError" Mar 11 18:53:31 crc kubenswrapper[4842]: E0311 18:53:31.672292 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" podUID="30a9b79e-4043-4dc7-b625-53e0962a745b" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.679903 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.717941 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77bdd5f885-kp7vg"] Mar 11 18:53:31 crc kubenswrapper[4842]: E0311 18:53:31.718215 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e46380b-794d-4d39-9f89-dd85644dda2b" containerName="pruner" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.718229 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e46380b-794d-4d39-9f89-dd85644dda2b" containerName="pruner" Mar 11 18:53:31 crc kubenswrapper[4842]: E0311 18:53:31.718247 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" containerName="controller-manager" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.718254 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" containerName="controller-manager" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.718365 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" containerName="controller-manager" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.718374 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e46380b-794d-4d39-9f89-dd85644dda2b" containerName="pruner" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.732217 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.745282 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77bdd5f885-kp7vg"] Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.798723 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-client-ca\") pod \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.799793 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hjf6\" (UniqueName: \"kubernetes.io/projected/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-kube-api-access-8hjf6\") pod \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.799955 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-config\") pod \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.800053 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-serving-cert\") pod \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\" (UID: \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.800162 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-proxy-ca-bundles\") pod \"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\" (UID: 
\"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb\") " Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.799998 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" (UID: "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.801174 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" (UID: "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.803657 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-config" (OuterVolumeSpecName: "config") pod "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" (UID: "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.812318 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" (UID: "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.832212 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-kube-api-access-8hjf6" (OuterVolumeSpecName: "kube-api-access-8hjf6") pod "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" (UID: "7cc2bdb7-dfbb-446a-843b-3a08892bf1eb"). InnerVolumeSpecName "kube-api-access-8hjf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901392 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-proxy-ca-bundles\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901435 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-config\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901461 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6wcs\" (UniqueName: \"kubernetes.io/projected/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-kube-api-access-r6wcs\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901483 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-client-ca\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901713 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-serving-cert\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901909 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901936 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hjf6\" (UniqueName: \"kubernetes.io/projected/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-kube-api-access-8hjf6\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901951 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901964 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:31 crc kubenswrapper[4842]: I0311 18:53:31.901976 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.003319 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-proxy-ca-bundles\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.003392 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-config\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.003448 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6wcs\" (UniqueName: \"kubernetes.io/projected/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-kube-api-access-r6wcs\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.003489 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-client-ca\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.003563 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-serving-cert\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.004531 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-proxy-ca-bundles\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.004713 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-client-ca\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.004935 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-config\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.026714 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6wcs\" (UniqueName: \"kubernetes.io/projected/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-kube-api-access-r6wcs\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.026959 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-serving-cert\") pod \"controller-manager-77bdd5f885-kp7vg\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.087570 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.425710 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" event={"ID":"7cc2bdb7-dfbb-446a-843b-3a08892bf1eb","Type":"ContainerDied","Data":"be3b15c1a70e0f27d068d2a55f8b18932280a1e83dbaf1e0c3a7bebf2bbeb13a"} Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.425734 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.425815 4842 scope.go:117] "RemoveContainer" containerID="c50ca3d842c66d701753afda09287ccb6b3978fa5243e9feafc8ccb895a39275" Mar 11 18:53:32 crc kubenswrapper[4842]: E0311 18:53:32.427439 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" podUID="30a9b79e-4043-4dc7-b625-53e0962a745b" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.478152 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dd97fc889-68h7m"] Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.483862 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-7dd97fc889-68h7m"] Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.669219 4842 patch_prober.go:28] interesting pod/controller-manager-7dd97fc889-68h7m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.669301 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dd97fc889-68h7m" podUID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 18:53:32 crc kubenswrapper[4842]: I0311 18:53:32.970570 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc2bdb7-dfbb-446a-843b-3a08892bf1eb" path="/var/lib/kubelet/pods/7cc2bdb7-dfbb-446a-843b-3a08892bf1eb/volumes" Mar 11 18:53:33 crc kubenswrapper[4842]: I0311 18:53:33.433439 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-4hgzc_e05f0ae6-fd2d-44ed-968d-d2b66ec70f83/route-controller-manager/0.log" Mar 11 18:53:33 crc kubenswrapper[4842]: I0311 18:53:33.433513 4842 generic.go:334] "Generic (PLEG): container finished" podID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" containerID="da041c5ba19ee086032a67b90c60a3ec98da38c59e9b33d05749f35478c9ac09" exitCode=137 Mar 11 18:53:33 crc kubenswrapper[4842]: I0311 18:53:33.433560 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" 
event={"ID":"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83","Type":"ContainerDied","Data":"da041c5ba19ee086032a67b90c60a3ec98da38c59e9b33d05749f35478c9ac09"} Mar 11 18:53:33 crc kubenswrapper[4842]: E0311 18:53:33.469555 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 11 18:53:33 crc kubenswrapper[4842]: E0311 18:53:33.469886 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzkht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOn
Error,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vb82w_openshift-marketplace(ff824009-ab02-4a23-9c8a-76bc3d6a5f04): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 18:53:33 crc kubenswrapper[4842]: E0311 18:53:33.472300 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vb82w" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" Mar 11 18:53:33 crc kubenswrapper[4842]: E0311 18:53:33.485000 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 11 18:53:33 crc kubenswrapper[4842]: E0311 18:53:33.485157 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gld7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d64lq_openshift-marketplace(a0ef18e4-d9a7-4122-89ed-b556ed419954): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 18:53:33 crc kubenswrapper[4842]: E0311 18:53:33.486360 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d64lq" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" Mar 11 18:53:35 crc 
kubenswrapper[4842]: E0311 18:53:35.116169 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vb82w" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" Mar 11 18:53:35 crc kubenswrapper[4842]: E0311 18:53:35.116651 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d64lq" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" Mar 11 18:53:35 crc kubenswrapper[4842]: E0311 18:53:35.216739 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 18:53:35 crc kubenswrapper[4842]: E0311 18:53:35.216917 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmnsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bg2xr_openshift-marketplace(7771ebaa-648a-46c4-986c-2cea25b5b7df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 18:53:35 crc kubenswrapper[4842]: E0311 18:53:35.218477 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bg2xr" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" Mar 11 18:53:38 crc 
kubenswrapper[4842]: I0311 18:53:38.126474 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77bdd5f885-kp7vg"] Mar 11 18:53:38 crc kubenswrapper[4842]: I0311 18:53:38.499438 4842 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4hgzc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 11 18:53:38 crc kubenswrapper[4842]: I0311 18:53:38.499493 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" podUID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 11 18:53:38 crc kubenswrapper[4842]: I0311 18:53:38.590226 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h8nlz" Mar 11 18:53:39 crc kubenswrapper[4842]: E0311 18:53:39.234292 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bg2xr" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" Mar 11 18:53:39 crc kubenswrapper[4842]: E0311 18:53:39.264994 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 11 18:53:39 crc kubenswrapper[4842]: E0311 18:53:39.265141 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckkkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hr6hd_openshift-marketplace(f46a8d85-7384-4cdc-a19d-92a477bcc7d6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 18:53:39 crc kubenswrapper[4842]: E0311 18:53:39.266324 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hr6hd" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.725585 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.726828 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.740533 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.826655 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.826962 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.927927 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.927981 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.928068 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:39 crc kubenswrapper[4842]: I0311 18:53:39.945227 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:40 crc kubenswrapper[4842]: I0311 18:53:40.063325 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:40 crc kubenswrapper[4842]: I0311 18:53:40.507360 4842 ???:1] "http: TLS handshake error from 192.168.126.11:41076: no serving certificate available for the kubelet" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.006530 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hr6hd" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.240598 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.240938 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5gmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7bslz_openshift-marketplace(60214716-6377-46b4-9c9e-adc90ffca659): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.242183 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7bslz" podUID="60214716-6377-46b4-9c9e-adc90ffca659" Mar 11 18:53:41 crc 
kubenswrapper[4842]: E0311 18:53:41.277781 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.278258 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9jr8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-6rrvh_openshift-marketplace(13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.279810 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6rrvh" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.335993 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-4hgzc_e05f0ae6-fd2d-44ed-968d-d2b66ec70f83/route-controller-manager/0.log" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.336059 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.365159 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98"] Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.365657 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" containerName="route-controller-manager" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.365671 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" containerName="route-controller-manager" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.365841 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" containerName="route-controller-manager" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.366209 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.370438 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98"] Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.450479 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-serving-cert\") pod \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.450537 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4tnz\" (UniqueName: \"kubernetes.io/projected/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-kube-api-access-b4tnz\") pod \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.450642 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-client-ca\") pod \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.451115 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-config\") pod \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\" (UID: \"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83\") " Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.451230 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-config\") pod 
\"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.451279 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-client-ca\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.451342 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-kube-api-access-k8flm\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.451433 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-serving-cert\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.451551 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-client-ca" (OuterVolumeSpecName: "client-ca") pod "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" (UID: "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.451931 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-config" (OuterVolumeSpecName: "config") pod "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" (UID: "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.456879 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-kube-api-access-b4tnz" (OuterVolumeSpecName: "kube-api-access-b4tnz") pod "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" (UID: "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83"). InnerVolumeSpecName "kube-api-access-b4tnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.456928 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" (UID: "e05f0ae6-fd2d-44ed-968d-d2b66ec70f83"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.490771 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-4hgzc_e05f0ae6-fd2d-44ed-968d-d2b66ec70f83/route-controller-manager/0.log" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.490878 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" event={"ID":"e05f0ae6-fd2d-44ed-968d-d2b66ec70f83","Type":"ContainerDied","Data":"415ac27cb7bccd09f23596f2cbe40adb213e04466cc836639ffe4ccd33edcf9f"} Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.490915 4842 scope.go:117] "RemoveContainer" containerID="da041c5ba19ee086032a67b90c60a3ec98da38c59e9b33d05749f35478c9ac09" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.490945 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.494324 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7bslz" podUID="60214716-6377-46b4-9c9e-adc90ffca659" Mar 11 18:53:41 crc kubenswrapper[4842]: E0311 18:53:41.494388 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6rrvh" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.543755 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"] Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.547790 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4hgzc"] Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.552675 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77bdd5f885-kp7vg"] Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.552707 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-serving-cert\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.552787 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-client-ca\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.552829 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-config\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.552867 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8flm\" (UniqueName: 
\"kubernetes.io/projected/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-kube-api-access-k8flm\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.552974 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.552988 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4tnz\" (UniqueName: \"kubernetes.io/projected/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-kube-api-access-b4tnz\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.553001 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.553016 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.554624 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-client-ca\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.554900 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-config\") pod 
\"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.559223 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-serving-cert\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.559370 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8vd7m"] Mar 11 18:53:41 crc kubenswrapper[4842]: W0311 18:53:41.560501 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab97406b_5f6d_4ffa_9271_a3c9ae1815c0.slice/crio-fff9e6882e30471b71d58ea191688e6101c7f749d6f0e55396ea38c95c9054bb WatchSource:0}: Error finding container fff9e6882e30471b71d58ea191688e6101c7f749d6f0e55396ea38c95c9054bb: Status 404 returned error can't find the container with id fff9e6882e30471b71d58ea191688e6101c7f749d6f0e55396ea38c95c9054bb Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.570716 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-kube-api-access-k8flm\") pod \"route-controller-manager-f6bb7f4ff-pwd98\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.596461 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 18:53:41 crc kubenswrapper[4842]: I0311 18:53:41.689772 4842 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.504037 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"126f2b05-e8aa-4546-a425-ff77aa82a8c2","Type":"ContainerStarted","Data":"c743bf6b24b81283ce94f780766e879952035e02ad16504a1fbd02f9b1ab19f0"} Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.505516 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" event={"ID":"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0","Type":"ContainerStarted","Data":"3b3dcd9d4e92342bf1da6c192755885a182bffa40951267a3e30b830d9baed2f"} Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.505565 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" event={"ID":"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0","Type":"ContainerStarted","Data":"fff9e6882e30471b71d58ea191688e6101c7f749d6f0e55396ea38c95c9054bb"} Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.505747 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" podUID="ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" containerName="controller-manager" containerID="cri-o://3b3dcd9d4e92342bf1da6c192755885a182bffa40951267a3e30b830d9baed2f" gracePeriod=30 Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.506871 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.509963 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" 
event={"ID":"a7a00900-ec76-49e4-9485-131830a0611e","Type":"ContainerStarted","Data":"4130d6cb9ac4c388c1c9432350363ed006bcd8a4aaf097b02845cea053ad2196"} Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.510013 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" event={"ID":"a7a00900-ec76-49e4-9485-131830a0611e","Type":"ContainerStarted","Data":"d6e2c3939e8625615c8a3f89cafd7c0d2c12107de51227e7d0f17c99a2c8aeab"} Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.511656 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.512882 4842 generic.go:334] "Generic (PLEG): container finished" podID="0c2935ea-3767-4054-8842-4d32fd301ba4" containerID="99f00000430cdac101125419533b06344adf92ca20bd9b1bf02b5fd156e58994" exitCode=0 Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.512919 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0c2935ea-3767-4054-8842-4d32fd301ba4","Type":"ContainerDied","Data":"99f00000430cdac101125419533b06344adf92ca20bd9b1bf02b5fd156e58994"} Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.529470 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" podStartSLOduration=24.529453293 podStartE2EDuration="24.529453293s" podCreationTimestamp="2026-03-11 18:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:42.525036675 +0000 UTC m=+268.172732955" watchObservedRunningTime="2026-03-11 18:53:42.529453293 +0000 UTC m=+268.177149573" Mar 11 18:53:42 crc kubenswrapper[4842]: I0311 18:53:42.971404 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e05f0ae6-fd2d-44ed-968d-d2b66ec70f83" path="/var/lib/kubelet/pods/e05f0ae6-fd2d-44ed-968d-d2b66ec70f83/volumes" Mar 11 18:53:43 crc kubenswrapper[4842]: I0311 18:53:43.519346 4842 generic.go:334] "Generic (PLEG): container finished" podID="ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" containerID="3b3dcd9d4e92342bf1da6c192755885a182bffa40951267a3e30b830d9baed2f" exitCode=0 Mar 11 18:53:43 crc kubenswrapper[4842]: I0311 18:53:43.519447 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" event={"ID":"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0","Type":"ContainerDied","Data":"3b3dcd9d4e92342bf1da6c192755885a182bffa40951267a3e30b830d9baed2f"} Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.721151 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.721910 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.728297 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.796218 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2484672-71b2-46e3-9ede-780d3e1aaafc-kube-api-access\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.796327 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-var-lock\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.796395 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.897401 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2484672-71b2-46e3-9ede-780d3e1aaafc-kube-api-access\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.897484 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-var-lock\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.897530 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.897612 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-var-lock\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.897664 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:44 crc kubenswrapper[4842]: I0311 18:53:44.915541 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2484672-71b2-46e3-9ede-780d3e1aaafc-kube-api-access\") pod \"installer-9-crc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:45 crc kubenswrapper[4842]: I0311 18:53:45.047439 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:53:48 crc kubenswrapper[4842]: I0311 18:53:48.935392 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.050923 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c2935ea-3767-4054-8842-4d32fd301ba4-kubelet-dir\") pod \"0c2935ea-3767-4054-8842-4d32fd301ba4\" (UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.050975 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c2935ea-3767-4054-8842-4d32fd301ba4-kube-api-access\") pod \"0c2935ea-3767-4054-8842-4d32fd301ba4\" (UID: \"0c2935ea-3767-4054-8842-4d32fd301ba4\") " Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.051687 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c2935ea-3767-4054-8842-4d32fd301ba4-kubelet-dir" (OuterVolumeSpecName: 
"kubelet-dir") pod "0c2935ea-3767-4054-8842-4d32fd301ba4" (UID: "0c2935ea-3767-4054-8842-4d32fd301ba4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.062003 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2935ea-3767-4054-8842-4d32fd301ba4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0c2935ea-3767-4054-8842-4d32fd301ba4" (UID: "0c2935ea-3767-4054-8842-4d32fd301ba4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.120004 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.152491 4842 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c2935ea-3767-4054-8842-4d32fd301ba4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.152539 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c2935ea-3767-4054-8842-4d32fd301ba4-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.255564 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-proxy-ca-bundles\") pod \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.255634 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-client-ca\") pod 
\"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.255665 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-config\") pod \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.255693 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-serving-cert\") pod \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.255774 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6wcs\" (UniqueName: \"kubernetes.io/projected/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-kube-api-access-r6wcs\") pod \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\" (UID: \"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0\") " Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.256979 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" (UID: "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.257038 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-config" (OuterVolumeSpecName: "config") pod "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" (UID: "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.257096 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" (UID: "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.260130 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" (UID: "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.260173 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-kube-api-access-r6wcs" (OuterVolumeSpecName: "kube-api-access-r6wcs") pod "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" (UID: "ab97406b-5f6d-4ffa-9271-a3c9ae1815c0"). InnerVolumeSpecName "kube-api-access-r6wcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.309887 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.318296 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98"] Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.358026 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.358064 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.358077 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.358109 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.358122 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6wcs\" (UniqueName: \"kubernetes.io/projected/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0-kube-api-access-r6wcs\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.558429 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" 
event={"ID":"ab97406b-5f6d-4ffa-9271-a3c9ae1815c0","Type":"ContainerDied","Data":"fff9e6882e30471b71d58ea191688e6101c7f749d6f0e55396ea38c95c9054bb"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.558716 4842 scope.go:117] "RemoveContainer" containerID="3b3dcd9d4e92342bf1da6c192755885a182bffa40951267a3e30b830d9baed2f" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.558460 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77bdd5f885-kp7vg" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.563143 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvszz" event={"ID":"7857f1af-d426-446f-a295-05423f407554","Type":"ContainerStarted","Data":"af59d540f2aea45554f84a18d24024d2d5956499039d0121360a1d8ce98544e5"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.565284 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8vd7m" event={"ID":"a7a00900-ec76-49e4-9485-131830a0611e","Type":"ContainerStarted","Data":"7596bd70e5533af61ba3b9cf73c5f7915c8ff058750c73aaf490b4ce758a910e"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.566629 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f2484672-71b2-46e3-9ede-780d3e1aaafc","Type":"ContainerStarted","Data":"f292ee96cf61fc1cb5a75c6659fcaab1de9d9fbfa199ed3627e69b48e2eacf32"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.568133 4842 generic.go:334] "Generic (PLEG): container finished" podID="22097394-ae90-446c-9114-14a1f1d184bb" containerID="3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d" exitCode=0 Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.568178 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvmzx" 
event={"ID":"22097394-ae90-446c-9114-14a1f1d184bb","Type":"ContainerDied","Data":"3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.569470 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" event={"ID":"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005","Type":"ContainerStarted","Data":"05d49a8823c82976c7ec08e604d4c72b513d8b7f08ae6f80d8c73fd88a82cefa"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.569496 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" event={"ID":"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005","Type":"ContainerStarted","Data":"cff9336d9f942728cc90f27160878abd2cbb447c2b5cf9099a3c85fff86654e9"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.569698 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.571334 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0c2935ea-3767-4054-8842-4d32fd301ba4","Type":"ContainerDied","Data":"5f54d05073fb192ae47f293a6a2b100fd43d2f566dc3b1bde3e3bf14941455f6"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.571437 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f54d05073fb192ae47f293a6a2b100fd43d2f566dc3b1bde3e3bf14941455f6" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.571543 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.574183 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"126f2b05-e8aa-4546-a425-ff77aa82a8c2","Type":"ContainerStarted","Data":"0be4a36e049cac7e10986340e24c3b647914026461364e6ef1f22511910e246e"} Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.599469 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" podStartSLOduration=11.59942686 podStartE2EDuration="11.59942686s" podCreationTimestamp="2026-03-11 18:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:49.596745024 +0000 UTC m=+275.244441304" watchObservedRunningTime="2026-03-11 18:53:49.59942686 +0000 UTC m=+275.247123150" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.630358 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=10.630339077 podStartE2EDuration="10.630339077s" podCreationTimestamp="2026-03-11 18:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:49.62399844 +0000 UTC m=+275.271694720" watchObservedRunningTime="2026-03-11 18:53:49.630339077 +0000 UTC m=+275.278035367" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.641947 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77bdd5f885-kp7vg"] Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.645911 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77bdd5f885-kp7vg"] Mar 11 18:53:49 crc 
kubenswrapper[4842]: I0311 18:53:49.653346 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8vd7m" podStartSLOduration=211.65333135 podStartE2EDuration="3m31.65333135s" podCreationTimestamp="2026-03-11 18:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:49.652265452 +0000 UTC m=+275.299961742" watchObservedRunningTime="2026-03-11 18:53:49.65333135 +0000 UTC m=+275.301027630" Mar 11 18:53:49 crc kubenswrapper[4842]: I0311 18:53:49.770402 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.571992 4842 csr.go:261] certificate signing request csr-s5x2m is approved, waiting to be issued Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.581452 4842 generic.go:334] "Generic (PLEG): container finished" podID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerID="a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6" exitCode=0 Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.581507 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb82w" event={"ID":"ff824009-ab02-4a23-9c8a-76bc3d6a5f04","Type":"ContainerDied","Data":"a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6"} Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.582353 4842 csr.go:257] certificate signing request csr-s5x2m is issued Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.583938 4842 generic.go:334] "Generic (PLEG): container finished" podID="7857f1af-d426-446f-a295-05423f407554" containerID="af59d540f2aea45554f84a18d24024d2d5956499039d0121360a1d8ce98544e5" exitCode=0 Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.584006 4842 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-jvszz" event={"ID":"7857f1af-d426-446f-a295-05423f407554","Type":"ContainerDied","Data":"af59d540f2aea45554f84a18d24024d2d5956499039d0121360a1d8ce98544e5"} Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.587585 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f2484672-71b2-46e3-9ede-780d3e1aaafc","Type":"ContainerStarted","Data":"70b3e586c0b04c5256f011269af604504923cfcdb75443fb6e44ebbb04e68bb5"} Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.589286 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvmzx" event={"ID":"22097394-ae90-446c-9114-14a1f1d184bb","Type":"ContainerStarted","Data":"e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e"} Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.591766 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" event={"ID":"30a9b79e-4043-4dc7-b625-53e0962a745b","Type":"ContainerStarted","Data":"53625180e2224384877ad13da1cdb960582d8b35a0042f9e0bfc69a0f3b11fce"} Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.595428 4842 generic.go:334] "Generic (PLEG): container finished" podID="126f2b05-e8aa-4546-a425-ff77aa82a8c2" containerID="0be4a36e049cac7e10986340e24c3b647914026461364e6ef1f22511910e246e" exitCode=0 Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.595483 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"126f2b05-e8aa-4546-a425-ff77aa82a8c2","Type":"ContainerDied","Data":"0be4a36e049cac7e10986340e24c3b647914026461364e6ef1f22511910e246e"} Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.645910 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" podStartSLOduration=62.35223922 
podStartE2EDuration="1m50.645890239s" podCreationTimestamp="2026-03-11 18:52:00 +0000 UTC" firstStartedPulling="2026-03-11 18:53:01.735578301 +0000 UTC m=+227.383274581" lastFinishedPulling="2026-03-11 18:53:50.02922933 +0000 UTC m=+275.676925600" observedRunningTime="2026-03-11 18:53:50.642140115 +0000 UTC m=+276.289836405" watchObservedRunningTime="2026-03-11 18:53:50.645890239 +0000 UTC m=+276.293586519" Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.660918 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.660895086 podStartE2EDuration="6.660895086s" podCreationTimestamp="2026-03-11 18:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:50.65737008 +0000 UTC m=+276.305066360" watchObservedRunningTime="2026-03-11 18:53:50.660895086 +0000 UTC m=+276.308591366" Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.692502 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wvmzx" podStartSLOduration=1.810007595 podStartE2EDuration="44.692487828s" podCreationTimestamp="2026-03-11 18:53:06 +0000 UTC" firstStartedPulling="2026-03-11 18:53:07.207310144 +0000 UTC m=+232.855006424" lastFinishedPulling="2026-03-11 18:53:50.089790357 +0000 UTC m=+275.737486657" observedRunningTime="2026-03-11 18:53:50.689962587 +0000 UTC m=+276.337658867" watchObservedRunningTime="2026-03-11 18:53:50.692487828 +0000 UTC m=+276.340184108" Mar 11 18:53:50 crc kubenswrapper[4842]: I0311 18:53:50.977830 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" path="/var/lib/kubelet/pods/ab97406b-5f6d-4ffa-9271-a3c9ae1815c0/volumes" Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.590967 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-29 04:36:08.274200007 +0000 UTC Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.591053 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7017h42m16.683151815s for next certificate rotation Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.606318 4842 generic.go:334] "Generic (PLEG): container finished" podID="30a9b79e-4043-4dc7-b625-53e0962a745b" containerID="53625180e2224384877ad13da1cdb960582d8b35a0042f9e0bfc69a0f3b11fce" exitCode=0 Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.606439 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" event={"ID":"30a9b79e-4043-4dc7-b625-53e0962a745b","Type":"ContainerDied","Data":"53625180e2224384877ad13da1cdb960582d8b35a0042f9e0bfc69a0f3b11fce"} Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.609718 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb82w" event={"ID":"ff824009-ab02-4a23-9c8a-76bc3d6a5f04","Type":"ContainerStarted","Data":"371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0"} Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.612479 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvszz" event={"ID":"7857f1af-d426-446f-a295-05423f407554","Type":"ContainerStarted","Data":"a50cd1d4365c3d5e1b721201d2640c6d833ccf538c623ca1d7ca0f336fa6be54"} Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.644785 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvszz" podStartSLOduration=2.866827244 podStartE2EDuration="45.644765565s" podCreationTimestamp="2026-03-11 18:53:06 +0000 UTC" firstStartedPulling="2026-03-11 18:53:08.222030416 +0000 UTC m=+233.869726696" lastFinishedPulling="2026-03-11 18:53:50.999968727 +0000 UTC m=+276.647665017" 
observedRunningTime="2026-03-11 18:53:51.644683362 +0000 UTC m=+277.292379642" watchObservedRunningTime="2026-03-11 18:53:51.644765565 +0000 UTC m=+277.292461835" Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.665383 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vb82w" podStartSLOduration=3.72315434 podStartE2EDuration="48.665355642s" podCreationTimestamp="2026-03-11 18:53:03 +0000 UTC" firstStartedPulling="2026-03-11 18:53:06.182887985 +0000 UTC m=+231.830584265" lastFinishedPulling="2026-03-11 18:53:51.125089287 +0000 UTC m=+276.772785567" observedRunningTime="2026-03-11 18:53:51.662651395 +0000 UTC m=+277.310347675" watchObservedRunningTime="2026-03-11 18:53:51.665355642 +0000 UTC m=+277.313051922" Mar 11 18:53:51 crc kubenswrapper[4842]: I0311 18:53:51.890779 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.002705 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kube-api-access\") pod \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.002962 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kubelet-dir\") pod \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\" (UID: \"126f2b05-e8aa-4546-a425-ff77aa82a8c2\") " Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.003076 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "126f2b05-e8aa-4546-a425-ff77aa82a8c2" (UID: 
"126f2b05-e8aa-4546-a425-ff77aa82a8c2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.003472 4842 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.009442 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "126f2b05-e8aa-4546-a425-ff77aa82a8c2" (UID: "126f2b05-e8aa-4546-a425-ff77aa82a8c2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.104254 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/126f2b05-e8aa-4546-a425-ff77aa82a8c2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.591624 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-21 07:15:01.327763752 +0000 UTC Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.592058 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6108h21m8.735709748s for next certificate rotation Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.619038 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg2xr" event={"ID":"7771ebaa-648a-46c4-986c-2cea25b5b7df","Type":"ContainerStarted","Data":"8f13e05c9a9edcb02f07b48ac2f49aec6281ee35d8867c0f8c2be887e237e694"} Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.620411 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"126f2b05-e8aa-4546-a425-ff77aa82a8c2","Type":"ContainerDied","Data":"c743bf6b24b81283ce94f780766e879952035e02ad16504a1fbd02f9b1ab19f0"} Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.620515 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.620441 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c743bf6b24b81283ce94f780766e879952035e02ad16504a1fbd02f9b1ab19f0" Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.622352 4842 generic.go:334] "Generic (PLEG): container finished" podID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerID="9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325" exitCode=0 Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.622438 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d64lq" event={"ID":"a0ef18e4-d9a7-4122-89ed-b556ed419954","Type":"ContainerDied","Data":"9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325"} Mar 11 18:53:52 crc kubenswrapper[4842]: I0311 18:53:52.841723 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.014341 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcfsz\" (UniqueName: \"kubernetes.io/projected/30a9b79e-4043-4dc7-b625-53e0962a745b-kube-api-access-kcfsz\") pod \"30a9b79e-4043-4dc7-b625-53e0962a745b\" (UID: \"30a9b79e-4043-4dc7-b625-53e0962a745b\") " Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.018708 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a9b79e-4043-4dc7-b625-53e0962a745b-kube-api-access-kcfsz" (OuterVolumeSpecName: "kube-api-access-kcfsz") pod "30a9b79e-4043-4dc7-b625-53e0962a745b" (UID: "30a9b79e-4043-4dc7-b625-53e0962a745b"). InnerVolumeSpecName "kube-api-access-kcfsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.064772 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-595f5986cd-ttkl4"] Mar 11 18:53:53 crc kubenswrapper[4842]: E0311 18:53:53.065003 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2935ea-3767-4054-8842-4d32fd301ba4" containerName="pruner" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065017 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2935ea-3767-4054-8842-4d32fd301ba4" containerName="pruner" Mar 11 18:53:53 crc kubenswrapper[4842]: E0311 18:53:53.065032 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" containerName="controller-manager" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065039 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" containerName="controller-manager" Mar 11 18:53:53 crc kubenswrapper[4842]: E0311 18:53:53.065055 4842 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="126f2b05-e8aa-4546-a425-ff77aa82a8c2" containerName="pruner" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065064 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="126f2b05-e8aa-4546-a425-ff77aa82a8c2" containerName="pruner" Mar 11 18:53:53 crc kubenswrapper[4842]: E0311 18:53:53.065077 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a9b79e-4043-4dc7-b625-53e0962a745b" containerName="oc" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065085 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a9b79e-4043-4dc7-b625-53e0962a745b" containerName="oc" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065203 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a9b79e-4043-4dc7-b625-53e0962a745b" containerName="oc" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065224 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2935ea-3767-4054-8842-4d32fd301ba4" containerName="pruner" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065232 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="126f2b05-e8aa-4546-a425-ff77aa82a8c2" containerName="pruner" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065247 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab97406b-5f6d-4ffa-9271-a3c9ae1815c0" containerName="controller-manager" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.065744 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.068218 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.068609 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.069836 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.070085 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.071640 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.071996 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.075002 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.076325 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-595f5986cd-ttkl4"] Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.116314 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcfsz\" (UniqueName: \"kubernetes.io/projected/30a9b79e-4043-4dc7-b625-53e0962a745b-kube-api-access-kcfsz\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.217938 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-config\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.218081 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-serving-cert\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.218171 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-client-ca\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.218382 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-proxy-ca-bundles\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.218410 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdm95\" (UniqueName: \"kubernetes.io/projected/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-kube-api-access-rdm95\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: 
\"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.318870 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-proxy-ca-bundles\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.318923 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdm95\" (UniqueName: \"kubernetes.io/projected/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-kube-api-access-rdm95\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.318981 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-config\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.319015 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-serving-cert\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.319043 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-client-ca\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.319969 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-client-ca\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.320257 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-config\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.320337 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-proxy-ca-bundles\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.322871 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-serving-cert\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.334861 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rdm95\" (UniqueName: \"kubernetes.io/projected/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-kube-api-access-rdm95\") pod \"controller-manager-595f5986cd-ttkl4\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.384741 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.569468 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-595f5986cd-ttkl4"] Mar 11 18:53:53 crc kubenswrapper[4842]: W0311 18:53:53.579318 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f7f8fd_0d14_4fd9_8e5c_cb170633746b.slice/crio-4c2cab2e8ed0b29eaa88ecc4b86e05f6899835c2abc596cc93cbb0e570b976a9 WatchSource:0}: Error finding container 4c2cab2e8ed0b29eaa88ecc4b86e05f6899835c2abc596cc93cbb0e570b976a9: Status 404 returned error can't find the container with id 4c2cab2e8ed0b29eaa88ecc4b86e05f6899835c2abc596cc93cbb0e570b976a9 Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.643685 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d64lq" event={"ID":"a0ef18e4-d9a7-4122-89ed-b556ed419954","Type":"ContainerStarted","Data":"816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8"} Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.644828 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" event={"ID":"98f7f8fd-0d14-4fd9-8e5c-cb170633746b","Type":"ContainerStarted","Data":"4c2cab2e8ed0b29eaa88ecc4b86e05f6899835c2abc596cc93cbb0e570b976a9"} Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.646761 4842 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" event={"ID":"30a9b79e-4043-4dc7-b625-53e0962a745b","Type":"ContainerDied","Data":"73fe1dddee717ceb6ffe0c2bb9279ed21bc22a67a33ef29a7d6be8bcd7151c55"} Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.646813 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73fe1dddee717ceb6ffe0c2bb9279ed21bc22a67a33ef29a7d6be8bcd7151c55" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.646959 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554252-4qdmf" Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.650884 4842 generic.go:334] "Generic (PLEG): container finished" podID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerID="8f13e05c9a9edcb02f07b48ac2f49aec6281ee35d8867c0f8c2be887e237e694" exitCode=0 Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.650938 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg2xr" event={"ID":"7771ebaa-648a-46c4-986c-2cea25b5b7df","Type":"ContainerDied","Data":"8f13e05c9a9edcb02f07b48ac2f49aec6281ee35d8867c0f8c2be887e237e694"} Mar 11 18:53:53 crc kubenswrapper[4842]: I0311 18:53:53.674439 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d64lq" podStartSLOduration=2.530119946 podStartE2EDuration="49.674423448s" podCreationTimestamp="2026-03-11 18:53:04 +0000 UTC" firstStartedPulling="2026-03-11 18:53:06.182985488 +0000 UTC m=+231.830681768" lastFinishedPulling="2026-03-11 18:53:53.32728899 +0000 UTC m=+278.974985270" observedRunningTime="2026-03-11 18:53:53.669497442 +0000 UTC m=+279.317193732" watchObservedRunningTime="2026-03-11 18:53:53.674423448 +0000 UTC m=+279.322119728" Mar 11 18:53:54 crc kubenswrapper[4842]: I0311 18:53:54.318905 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:54 crc kubenswrapper[4842]: I0311 18:53:54.318965 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:54 crc kubenswrapper[4842]: I0311 18:53:54.656553 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" event={"ID":"98f7f8fd-0d14-4fd9-8e5c-cb170633746b","Type":"ContainerStarted","Data":"2520b1c731643e4f85b1643adfcce67bbf909115f2de0b9e51555cc1aea96e86"} Mar 11 18:53:54 crc kubenswrapper[4842]: I0311 18:53:54.656990 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:54 crc kubenswrapper[4842]: I0311 18:53:54.661742 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:54 crc kubenswrapper[4842]: I0311 18:53:54.677050 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" podStartSLOduration=16.677027828 podStartE2EDuration="16.677027828s" podCreationTimestamp="2026-03-11 18:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:53:54.67431943 +0000 UTC m=+280.322015730" watchObservedRunningTime="2026-03-11 18:53:54.677027828 +0000 UTC m=+280.324724138" Mar 11 18:53:54 crc kubenswrapper[4842]: I0311 18:53:54.686653 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:53:55 crc kubenswrapper[4842]: I0311 18:53:55.024125 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:55 crc 
kubenswrapper[4842]: I0311 18:53:55.024480 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:53:56 crc kubenswrapper[4842]: I0311 18:53:56.059224 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-d64lq" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="registry-server" probeResult="failure" output=< Mar 11 18:53:56 crc kubenswrapper[4842]: timeout: failed to connect service ":50051" within 1s Mar 11 18:53:56 crc kubenswrapper[4842]: > Mar 11 18:53:56 crc kubenswrapper[4842]: I0311 18:53:56.642233 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:56 crc kubenswrapper[4842]: I0311 18:53:56.643024 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:56 crc kubenswrapper[4842]: I0311 18:53:56.684104 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:56 crc kubenswrapper[4842]: I0311 18:53:56.724431 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wvmzx" Mar 11 18:53:57 crc kubenswrapper[4842]: I0311 18:53:57.237637 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:57 crc kubenswrapper[4842]: I0311 18:53:57.237979 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:53:57 crc kubenswrapper[4842]: I0311 18:53:57.690100 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg2xr" 
event={"ID":"7771ebaa-648a-46c4-986c-2cea25b5b7df","Type":"ContainerStarted","Data":"962be4f0ab85aa58ee0681d0cf5db1d1f3f7d476fd6e64c4fbd5be31d60cf079"} Mar 11 18:53:57 crc kubenswrapper[4842]: I0311 18:53:57.709575 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bg2xr" podStartSLOduration=4.509981484 podStartE2EDuration="54.709555459s" podCreationTimestamp="2026-03-11 18:53:03 +0000 UTC" firstStartedPulling="2026-03-11 18:53:06.156347427 +0000 UTC m=+231.804043707" lastFinishedPulling="2026-03-11 18:53:56.355921392 +0000 UTC m=+282.003617682" observedRunningTime="2026-03-11 18:53:57.706240421 +0000 UTC m=+283.353936731" watchObservedRunningTime="2026-03-11 18:53:57.709555459 +0000 UTC m=+283.357251739" Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.127328 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-595f5986cd-ttkl4"] Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.127541 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" podUID="98f7f8fd-0d14-4fd9-8e5c-cb170633746b" containerName="controller-manager" containerID="cri-o://2520b1c731643e4f85b1643adfcce67bbf909115f2de0b9e51555cc1aea96e86" gracePeriod=30 Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.147763 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98"] Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.147965 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" podUID="ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" containerName="route-controller-manager" containerID="cri-o://05d49a8823c82976c7ec08e604d4c72b513d8b7f08ae6f80d8c73fd88a82cefa" gracePeriod=30 Mar 11 18:53:58 crc 
kubenswrapper[4842]: I0311 18:53:58.281191 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvszz" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="registry-server" probeResult="failure" output=< Mar 11 18:53:58 crc kubenswrapper[4842]: timeout: failed to connect service ":50051" within 1s Mar 11 18:53:58 crc kubenswrapper[4842]: > Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.697181 4842 generic.go:334] "Generic (PLEG): container finished" podID="98f7f8fd-0d14-4fd9-8e5c-cb170633746b" containerID="2520b1c731643e4f85b1643adfcce67bbf909115f2de0b9e51555cc1aea96e86" exitCode=0 Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.697246 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" event={"ID":"98f7f8fd-0d14-4fd9-8e5c-cb170633746b","Type":"ContainerDied","Data":"2520b1c731643e4f85b1643adfcce67bbf909115f2de0b9e51555cc1aea96e86"} Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.698991 4842 generic.go:334] "Generic (PLEG): container finished" podID="ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" containerID="05d49a8823c82976c7ec08e604d4c72b513d8b7f08ae6f80d8c73fd88a82cefa" exitCode=0 Mar 11 18:53:58 crc kubenswrapper[4842]: I0311 18:53:58.699185 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" event={"ID":"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005","Type":"ContainerDied","Data":"05d49a8823c82976c7ec08e604d4c72b513d8b7f08ae6f80d8c73fd88a82cefa"} Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.000121 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvmzx"] Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.245962 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.250705 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.281958 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"] Mar 11 18:53:59 crc kubenswrapper[4842]: E0311 18:53:59.282260 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f7f8fd-0d14-4fd9-8e5c-cb170633746b" containerName="controller-manager" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.282319 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f7f8fd-0d14-4fd9-8e5c-cb170633746b" containerName="controller-manager" Mar 11 18:53:59 crc kubenswrapper[4842]: E0311 18:53:59.282334 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" containerName="route-controller-manager" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.282341 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" containerName="route-controller-manager" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.282479 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f7f8fd-0d14-4fd9-8e5c-cb170633746b" containerName="controller-manager" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.282491 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" containerName="route-controller-manager" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.282864 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.290334 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"] Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401151 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-config\") pod \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401246 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-client-ca\") pod \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401320 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-config\") pod \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401347 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-client-ca\") pod \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401366 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-serving-cert\") pod \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\" (UID: 
\"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401452 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-kube-api-access-k8flm\") pod \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401523 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-serving-cert\") pod \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\" (UID: \"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401539 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-proxy-ca-bundles\") pod \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401559 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdm95\" (UniqueName: \"kubernetes.io/projected/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-kube-api-access-rdm95\") pod \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\" (UID: \"98f7f8fd-0d14-4fd9-8e5c-cb170633746b\") " Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401790 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45sgl\" (UniqueName: \"kubernetes.io/projected/693bba87-d346-4a84-8289-274e437065d0-kube-api-access-45sgl\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: 
I0311 18:53:59.401842 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-config\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401862 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-client-ca\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.401999 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693bba87-d346-4a84-8289-274e437065d0-serving-cert\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.402704 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "98f7f8fd-0d14-4fd9-8e5c-cb170633746b" (UID: "98f7f8fd-0d14-4fd9-8e5c-cb170633746b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.402939 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-config" (OuterVolumeSpecName: "config") pod "98f7f8fd-0d14-4fd9-8e5c-cb170633746b" (UID: "98f7f8fd-0d14-4fd9-8e5c-cb170633746b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.403407 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-client-ca" (OuterVolumeSpecName: "client-ca") pod "98f7f8fd-0d14-4fd9-8e5c-cb170633746b" (UID: "98f7f8fd-0d14-4fd9-8e5c-cb170633746b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.403418 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-config" (OuterVolumeSpecName: "config") pod "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" (UID: "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.403396 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" (UID: "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.407523 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-kube-api-access-rdm95" (OuterVolumeSpecName: "kube-api-access-rdm95") pod "98f7f8fd-0d14-4fd9-8e5c-cb170633746b" (UID: "98f7f8fd-0d14-4fd9-8e5c-cb170633746b"). InnerVolumeSpecName "kube-api-access-rdm95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.407668 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" (UID: "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.407717 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "98f7f8fd-0d14-4fd9-8e5c-cb170633746b" (UID: "98f7f8fd-0d14-4fd9-8e5c-cb170633746b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.407845 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-kube-api-access-k8flm" (OuterVolumeSpecName: "kube-api-access-k8flm") pod "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" (UID: "ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005"). InnerVolumeSpecName "kube-api-access-k8flm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503098 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45sgl\" (UniqueName: \"kubernetes.io/projected/693bba87-d346-4a84-8289-274e437065d0-kube-api-access-45sgl\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503155 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-config\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503176 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-client-ca\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503242 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693bba87-d346-4a84-8289-274e437065d0-serving-cert\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503322 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503336 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503348 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503362 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8flm\" (UniqueName: \"kubernetes.io/projected/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-kube-api-access-k8flm\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503373 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503384 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503395 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdm95\" (UniqueName: \"kubernetes.io/projected/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-kube-api-access-rdm95\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.503406 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc 
kubenswrapper[4842]: I0311 18:53:59.503416 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98f7f8fd-0d14-4fd9-8e5c-cb170633746b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.505719 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-client-ca\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.506475 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693bba87-d346-4a84-8289-274e437065d0-serving-cert\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.506524 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-config\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.521812 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45sgl\" (UniqueName: \"kubernetes.io/projected/693bba87-d346-4a84-8289-274e437065d0-kube-api-access-45sgl\") pod \"route-controller-manager-7c9865db89-k4tf8\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 
18:53:59.617583 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.707674 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98" event={"ID":"ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005","Type":"ContainerDied","Data":"cff9336d9f942728cc90f27160878abd2cbb447c2b5cf9099a3c85fff86654e9"}
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.707688 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98"
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.707974 4842 scope.go:117] "RemoveContainer" containerID="05d49a8823c82976c7ec08e604d4c72b513d8b7f08ae6f80d8c73fd88a82cefa"
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.709591 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4"
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.709606 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-595f5986cd-ttkl4" event={"ID":"98f7f8fd-0d14-4fd9-8e5c-cb170633746b","Type":"ContainerDied","Data":"4c2cab2e8ed0b29eaa88ecc4b86e05f6899835c2abc596cc93cbb0e570b976a9"}
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.709711 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wvmzx" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="registry-server" containerID="cri-o://e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e" gracePeriod=2
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.757126 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98"]
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.759634 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f6bb7f4ff-pwd98"]
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.768007 4842 scope.go:117] "RemoveContainer" containerID="2520b1c731643e4f85b1643adfcce67bbf909115f2de0b9e51555cc1aea96e86"
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.770339 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-595f5986cd-ttkl4"]
Mar 11 18:53:59 crc kubenswrapper[4842]: I0311 18:53:59.776746 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-595f5986cd-ttkl4"]
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.003187 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"]
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.133813 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554254-vzflg"]
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.134494 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554254-vzflg"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.138549 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.138897 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.138966 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.140643 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554254-vzflg"]
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.176417 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvmzx"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.313904 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-catalog-content\") pod \"22097394-ae90-446c-9114-14a1f1d184bb\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") "
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.314598 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-utilities\") pod \"22097394-ae90-446c-9114-14a1f1d184bb\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") "
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.314667 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlggr\" (UniqueName: \"kubernetes.io/projected/22097394-ae90-446c-9114-14a1f1d184bb-kube-api-access-rlggr\") pod \"22097394-ae90-446c-9114-14a1f1d184bb\" (UID: \"22097394-ae90-446c-9114-14a1f1d184bb\") "
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.314915 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl67g\" (UniqueName: \"kubernetes.io/projected/842d8359-baaa-48cc-b80f-28a6e0045e8b-kube-api-access-bl67g\") pod \"auto-csr-approver-29554254-vzflg\" (UID: \"842d8359-baaa-48cc-b80f-28a6e0045e8b\") " pod="openshift-infra/auto-csr-approver-29554254-vzflg"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.315346 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-utilities" (OuterVolumeSpecName: "utilities") pod "22097394-ae90-446c-9114-14a1f1d184bb" (UID: "22097394-ae90-446c-9114-14a1f1d184bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.323568 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22097394-ae90-446c-9114-14a1f1d184bb-kube-api-access-rlggr" (OuterVolumeSpecName: "kube-api-access-rlggr") pod "22097394-ae90-446c-9114-14a1f1d184bb" (UID: "22097394-ae90-446c-9114-14a1f1d184bb"). InnerVolumeSpecName "kube-api-access-rlggr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.362245 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22097394-ae90-446c-9114-14a1f1d184bb" (UID: "22097394-ae90-446c-9114-14a1f1d184bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.416853 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl67g\" (UniqueName: \"kubernetes.io/projected/842d8359-baaa-48cc-b80f-28a6e0045e8b-kube-api-access-bl67g\") pod \"auto-csr-approver-29554254-vzflg\" (UID: \"842d8359-baaa-48cc-b80f-28a6e0045e8b\") " pod="openshift-infra/auto-csr-approver-29554254-vzflg"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.417002 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.417015 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlggr\" (UniqueName: \"kubernetes.io/projected/22097394-ae90-446c-9114-14a1f1d184bb-kube-api-access-rlggr\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.417026 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22097394-ae90-446c-9114-14a1f1d184bb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.436631 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl67g\" (UniqueName: \"kubernetes.io/projected/842d8359-baaa-48cc-b80f-28a6e0045e8b-kube-api-access-bl67g\") pod \"auto-csr-approver-29554254-vzflg\" (UID: \"842d8359-baaa-48cc-b80f-28a6e0045e8b\") " pod="openshift-infra/auto-csr-approver-29554254-vzflg"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.474212 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554254-vzflg"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.721211 4842 generic.go:334] "Generic (PLEG): container finished" podID="22097394-ae90-446c-9114-14a1f1d184bb" containerID="e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e" exitCode=0
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.721361 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvmzx"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.721389 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvmzx" event={"ID":"22097394-ae90-446c-9114-14a1f1d184bb","Type":"ContainerDied","Data":"e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e"}
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.724679 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvmzx" event={"ID":"22097394-ae90-446c-9114-14a1f1d184bb","Type":"ContainerDied","Data":"93d01703a7e129f0d05259cad617199bf872d703db62fef499d93c075bf2dc14"}
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.724716 4842 scope.go:117] "RemoveContainer" containerID="e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.729856 4842 generic.go:334] "Generic (PLEG): container finished" podID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerID="3a0a4879a9940bb81bb8bdae6954db57043686dc5a432878a73c8edc4dc7b818" exitCode=0
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.729966 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rrvh" event={"ID":"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7","Type":"ContainerDied","Data":"3a0a4879a9940bb81bb8bdae6954db57043686dc5a432878a73c8edc4dc7b818"}
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.733912 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" event={"ID":"693bba87-d346-4a84-8289-274e437065d0","Type":"ContainerStarted","Data":"cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6"}
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.733978 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" event={"ID":"693bba87-d346-4a84-8289-274e437065d0","Type":"ContainerStarted","Data":"7e2621c59a3301d1a6379326227b843d5fb44524487f3edf1d7bab1865b2d74f"}
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.734526 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.737051 4842 generic.go:334] "Generic (PLEG): container finished" podID="60214716-6377-46b4-9c9e-adc90ffca659" containerID="8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb" exitCode=0
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.737132 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bslz" event={"ID":"60214716-6377-46b4-9c9e-adc90ffca659","Type":"ContainerDied","Data":"8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb"}
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.741744 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.743697 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hr6hd" event={"ID":"f46a8d85-7384-4cdc-a19d-92a477bcc7d6","Type":"ContainerStarted","Data":"ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1"}
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.758459 4842 scope.go:117] "RemoveContainer" containerID="3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.798326 4842 scope.go:117] "RemoveContainer" containerID="54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.820412 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvmzx"]
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.824156 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvmzx"]
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.831797 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" podStartSLOduration=2.831769003 podStartE2EDuration="2.831769003s" podCreationTimestamp="2026-03-11 18:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:54:00.831227254 +0000 UTC m=+286.478923534" watchObservedRunningTime="2026-03-11 18:54:00.831769003 +0000 UTC m=+286.479465283"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.865583 4842 scope.go:117] "RemoveContainer" containerID="e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e"
Mar 11 18:54:00 crc kubenswrapper[4842]: E0311 18:54:00.866189 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e\": container with ID starting with e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e not found: ID does not exist" containerID="e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.866227 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e"} err="failed to get container status \"e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e\": rpc error: code = NotFound desc = could not find container \"e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e\": container with ID starting with e54c482639938df92f84074016d979611246f3864dba8fc9fe2853609fd28d8e not found: ID does not exist"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.866249 4842 scope.go:117] "RemoveContainer" containerID="3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d"
Mar 11 18:54:00 crc kubenswrapper[4842]: E0311 18:54:00.866609 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d\": container with ID starting with 3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d not found: ID does not exist" containerID="3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.866652 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d"} err="failed to get container status \"3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d\": rpc error: code = NotFound desc = could not find container \"3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d\": container with ID starting with 3b5f9daa7b48f570c2c2a184ee9082f6dc44cb0dfb6fec70eba8ba03e87be00d not found: ID does not exist"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.866682 4842 scope.go:117] "RemoveContainer" containerID="54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53"
Mar 11 18:54:00 crc kubenswrapper[4842]: E0311 18:54:00.866994 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53\": container with ID starting with 54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53 not found: ID does not exist" containerID="54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.867020 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53"} err="failed to get container status \"54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53\": rpc error: code = NotFound desc = could not find container \"54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53\": container with ID starting with 54c0928d073db1c77cff93157a98d8742aa28e6dd5cdf8f8110e7e1c743a3b53 not found: ID does not exist"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.902615 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554254-vzflg"]
Mar 11 18:54:00 crc kubenswrapper[4842]: W0311 18:54:00.907463 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod842d8359_baaa_48cc_b80f_28a6e0045e8b.slice/crio-7bda5da6a95e3fc375addfbcf80383b93427ecb6c183e0eb1087d2792a8375ce WatchSource:0}: Error finding container 7bda5da6a95e3fc375addfbcf80383b93427ecb6c183e0eb1087d2792a8375ce: Status 404 returned error can't find the container with id 7bda5da6a95e3fc375addfbcf80383b93427ecb6c183e0eb1087d2792a8375ce
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.969747 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22097394-ae90-446c-9114-14a1f1d184bb" path="/var/lib/kubelet/pods/22097394-ae90-446c-9114-14a1f1d184bb/volumes"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.970472 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f7f8fd-0d14-4fd9-8e5c-cb170633746b" path="/var/lib/kubelet/pods/98f7f8fd-0d14-4fd9-8e5c-cb170633746b/volumes"
Mar 11 18:54:00 crc kubenswrapper[4842]: I0311 18:54:00.971159 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005" path="/var/lib/kubelet/pods/ab4fb9ce-f75e-4bcb-a5a3-c2b8db827005/volumes"
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.472292 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.472637 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.472683 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs"
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.473806 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.473890 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd" gracePeriod=600
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.752887 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rrvh" event={"ID":"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7","Type":"ContainerStarted","Data":"e64d291da777c56c62b53f1ae235ae903161067e39a1db556eaca3a58b30eb82"}
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.754461 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd" exitCode=0
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.754535 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd"}
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.755491 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554254-vzflg" event={"ID":"842d8359-baaa-48cc-b80f-28a6e0045e8b","Type":"ContainerStarted","Data":"7bda5da6a95e3fc375addfbcf80383b93427ecb6c183e0eb1087d2792a8375ce"}
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.758305 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bslz" event={"ID":"60214716-6377-46b4-9c9e-adc90ffca659","Type":"ContainerStarted","Data":"e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003"}
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.759882 4842 generic.go:334] "Generic (PLEG): container finished" podID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerID="ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1" exitCode=0
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.759916 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hr6hd" event={"ID":"f46a8d85-7384-4cdc-a19d-92a477bcc7d6","Type":"ContainerDied","Data":"ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1"}
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.776999 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6rrvh" podStartSLOduration=2.7661474889999997 podStartE2EDuration="56.776970957s" podCreationTimestamp="2026-03-11 18:53:05 +0000 UTC" firstStartedPulling="2026-03-11 18:53:07.200966782 +0000 UTC m=+232.848663062" lastFinishedPulling="2026-03-11 18:54:01.21179025 +0000 UTC m=+286.859486530" observedRunningTime="2026-03-11 18:54:01.77538943 +0000 UTC m=+287.423085720" watchObservedRunningTime="2026-03-11 18:54:01.776970957 +0000 UTC m=+287.424667237"
Mar 11 18:54:01 crc kubenswrapper[4842]: I0311 18:54:01.799138 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7bslz" podStartSLOduration=2.82567143 podStartE2EDuration="57.79911944s" podCreationTimestamp="2026-03-11 18:53:04 +0000 UTC" firstStartedPulling="2026-03-11 18:53:06.182987658 +0000 UTC m=+231.830683938" lastFinishedPulling="2026-03-11 18:54:01.156435658 +0000 UTC m=+286.804131948" observedRunningTime="2026-03-11 18:54:01.797088927 +0000 UTC m=+287.444785207" watchObservedRunningTime="2026-03-11 18:54:01.79911944 +0000 UTC m=+287.446815720"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.072197 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-945c7c594-g7j67"]
Mar 11 18:54:02 crc kubenswrapper[4842]: E0311 18:54:02.072438 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="registry-server"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.072453 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="registry-server"
Mar 11 18:54:02 crc kubenswrapper[4842]: E0311 18:54:02.072462 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="extract-content"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.072470 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="extract-content"
Mar 11 18:54:02 crc kubenswrapper[4842]: E0311 18:54:02.072491 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="extract-utilities"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.072502 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="extract-utilities"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.072608 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="22097394-ae90-446c-9114-14a1f1d184bb" containerName="registry-server"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.073095 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.075354 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.075608 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.075980 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.076330 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.079038 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.079220 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.084842 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.094573 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-945c7c594-g7j67"]
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.247222 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-proxy-ca-bundles\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.247665 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-client-ca\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.247824 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-config\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.247867 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm7mj\" (UniqueName: \"kubernetes.io/projected/a6bba7c5-39f3-4c22-9561-688c47d0fba9-kube-api-access-wm7mj\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.248031 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6bba7c5-39f3-4c22-9561-688c47d0fba9-serving-cert\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.349693 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-proxy-ca-bundles\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.349753 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-client-ca\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.349798 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-config\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.349818 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm7mj\" (UniqueName: \"kubernetes.io/projected/a6bba7c5-39f3-4c22-9561-688c47d0fba9-kube-api-access-wm7mj\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.349858 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6bba7c5-39f3-4c22-9561-688c47d0fba9-serving-cert\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.350945 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-proxy-ca-bundles\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.350957 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-client-ca\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.351504 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-config\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.357504 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6bba7c5-39f3-4c22-9561-688c47d0fba9-serving-cert\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.368500 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm7mj\" (UniqueName: \"kubernetes.io/projected/a6bba7c5-39f3-4c22-9561-688c47d0fba9-kube-api-access-wm7mj\") pod \"controller-manager-945c7c594-g7j67\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.393675 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.600056 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-945c7c594-g7j67"]
Mar 11 18:54:02 crc kubenswrapper[4842]: W0311 18:54:02.608725 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6bba7c5_39f3_4c22_9561_688c47d0fba9.slice/crio-c82591a9ee489944934a37f7f1c3d84ff3daf251ecab4f77efa06747d5fbe1e4 WatchSource:0}: Error finding container c82591a9ee489944934a37f7f1c3d84ff3daf251ecab4f77efa06747d5fbe1e4: Status 404 returned error can't find the container with id c82591a9ee489944934a37f7f1c3d84ff3daf251ecab4f77efa06747d5fbe1e4
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.770443 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554254-vzflg" event={"ID":"842d8359-baaa-48cc-b80f-28a6e0045e8b","Type":"ContainerStarted","Data":"a26553ba937c4505a429758298d6c48f723370ed334257b27cdce5bde464d91a"}
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.776521 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" event={"ID":"a6bba7c5-39f3-4c22-9561-688c47d0fba9","Type":"ContainerStarted","Data":"c82591a9ee489944934a37f7f1c3d84ff3daf251ecab4f77efa06747d5fbe1e4"}
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.781666 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"b4a7b65bec67b2a820939afb6031ff12cd763991ed39162f5d44f041b4219c2a"}
Mar 11 18:54:02 crc kubenswrapper[4842]: I0311 18:54:02.789966 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29554254-vzflg" podStartSLOduration=1.698994396 podStartE2EDuration="2.789950208s" podCreationTimestamp="2026-03-11 18:54:00 +0000 UTC" firstStartedPulling="2026-03-11 18:54:00.910737401 +0000 UTC m=+286.558433681" lastFinishedPulling="2026-03-11 18:54:02.001693213 +0000 UTC m=+287.649389493" observedRunningTime="2026-03-11 18:54:02.785516329 +0000 UTC m=+288.433212609" watchObservedRunningTime="2026-03-11 18:54:02.789950208 +0000 UTC m=+288.437646488"
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.789499 4842 generic.go:334] "Generic (PLEG): container finished" podID="842d8359-baaa-48cc-b80f-28a6e0045e8b" containerID="a26553ba937c4505a429758298d6c48f723370ed334257b27cdce5bde464d91a" exitCode=0
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.789555 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554254-vzflg" event={"ID":"842d8359-baaa-48cc-b80f-28a6e0045e8b","Type":"ContainerDied","Data":"a26553ba937c4505a429758298d6c48f723370ed334257b27cdce5bde464d91a"}
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.791726 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" event={"ID":"a6bba7c5-39f3-4c22-9561-688c47d0fba9","Type":"ContainerStarted","Data":"c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e"}
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.791947 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.795392 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hr6hd" event={"ID":"f46a8d85-7384-4cdc-a19d-92a477bcc7d6","Type":"ContainerStarted","Data":"317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c"}
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.798233 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67"
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.835867 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" podStartSLOduration=5.835848727 podStartE2EDuration="5.835848727s" podCreationTimestamp="2026-03-11 18:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:54:03.833659909 +0000 UTC m=+289.481356199" watchObservedRunningTime="2026-03-11 18:54:03.835848727 +0000 UTC m=+289.483545007"
Mar 11 18:54:03 crc kubenswrapper[4842]: I0311 18:54:03.859413 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hr6hd" podStartSLOduration=3.175453661 podStartE2EDuration="56.859395751s" podCreationTimestamp="2026-03-11 18:53:07 +0000 UTC" firstStartedPulling="2026-03-11 18:53:09.257313766 +0000 UTC m=+234.905010046" lastFinishedPulling="2026-03-11 18:54:02.941255856 +0000 UTC m=+288.588952136" observedRunningTime="2026-03-11 18:54:03.859199554 +0000 UTC m=+289.506895834" watchObservedRunningTime="2026-03-11 18:54:03.859395751 +0000 UTC m=+289.507092031"
Mar 11 18:54:04 crc kubenswrapper[4842]: I0311 18:54:04.117319 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bg2xr"
Mar 11 18:54:04 crc kubenswrapper[4842]: I0311 18:54:04.117391 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bg2xr"
Mar 11 18:54:04 crc
kubenswrapper[4842]: I0311 18:54:04.177753 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:54:04 crc kubenswrapper[4842]: I0311 18:54:04.361603 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:54:04 crc kubenswrapper[4842]: I0311 18:54:04.546028 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:54:04 crc kubenswrapper[4842]: I0311 18:54:04.546076 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:54:04 crc kubenswrapper[4842]: I0311 18:54:04.584213 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:54:04 crc kubenswrapper[4842]: I0311 18:54:04.845166 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.028923 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554254-vzflg" Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.077493 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.120225 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.191747 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl67g\" (UniqueName: \"kubernetes.io/projected/842d8359-baaa-48cc-b80f-28a6e0045e8b-kube-api-access-bl67g\") pod \"842d8359-baaa-48cc-b80f-28a6e0045e8b\" (UID: \"842d8359-baaa-48cc-b80f-28a6e0045e8b\") " Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.198348 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842d8359-baaa-48cc-b80f-28a6e0045e8b-kube-api-access-bl67g" (OuterVolumeSpecName: "kube-api-access-bl67g") pod "842d8359-baaa-48cc-b80f-28a6e0045e8b" (UID: "842d8359-baaa-48cc-b80f-28a6e0045e8b"). InnerVolumeSpecName "kube-api-access-bl67g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.293124 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl67g\" (UniqueName: \"kubernetes.io/projected/842d8359-baaa-48cc-b80f-28a6e0045e8b-kube-api-access-bl67g\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.824549 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554254-vzflg" event={"ID":"842d8359-baaa-48cc-b80f-28a6e0045e8b","Type":"ContainerDied","Data":"7bda5da6a95e3fc375addfbcf80383b93427ecb6c183e0eb1087d2792a8375ce"} Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.824990 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bda5da6a95e3fc375addfbcf80383b93427ecb6c183e0eb1087d2792a8375ce" Mar 11 18:54:05 crc kubenswrapper[4842]: I0311 18:54:05.825061 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554254-vzflg" Mar 11 18:54:06 crc kubenswrapper[4842]: I0311 18:54:06.263177 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:54:06 crc kubenswrapper[4842]: I0311 18:54:06.263375 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:54:06 crc kubenswrapper[4842]: I0311 18:54:06.309081 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:54:06 crc kubenswrapper[4842]: I0311 18:54:06.880178 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.298403 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.339947 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.393457 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d64lq"] Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.393659 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d64lq" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="registry-server" containerID="cri-o://816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8" gracePeriod=2 Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.633634 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.633686 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.776020 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.836245 4842 generic.go:334] "Generic (PLEG): container finished" podID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerID="816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8" exitCode=0 Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.836331 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d64lq" event={"ID":"a0ef18e4-d9a7-4122-89ed-b556ed419954","Type":"ContainerDied","Data":"816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8"} Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.836415 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d64lq" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.836445 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d64lq" event={"ID":"a0ef18e4-d9a7-4122-89ed-b556ed419954","Type":"ContainerDied","Data":"7bcec6492fefc7f7ebe3fd14a5cc50bd50d6229b67a881406a88142ea6a64115"} Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.836476 4842 scope.go:117] "RemoveContainer" containerID="816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.853310 4842 scope.go:117] "RemoveContainer" containerID="9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.875927 4842 scope.go:117] "RemoveContainer" containerID="204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.892447 4842 scope.go:117] "RemoveContainer" containerID="816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8" Mar 11 18:54:07 crc kubenswrapper[4842]: E0311 18:54:07.892776 4842 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8\": container with ID starting with 816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8 not found: ID does not exist" containerID="816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.892847 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8"} err="failed to get container status \"816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8\": rpc error: code = NotFound desc = could not find container \"816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8\": container with ID starting with 816ab7659484f5a99eb4cd480af2b2824d58e9a594d05a570bf01f3fee034ca8 not found: ID does not exist" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.892875 4842 scope.go:117] "RemoveContainer" containerID="9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325" Mar 11 18:54:07 crc kubenswrapper[4842]: E0311 18:54:07.893128 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325\": container with ID starting with 9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325 not found: ID does not exist" containerID="9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.893162 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325"} err="failed to get container status \"9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325\": rpc error: code = NotFound desc = could not find container 
\"9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325\": container with ID starting with 9cc17377922a096e329415ae60daf63345eb318b2da2ec6065b610c08b4fb325 not found: ID does not exist" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.893189 4842 scope.go:117] "RemoveContainer" containerID="204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44" Mar 11 18:54:07 crc kubenswrapper[4842]: E0311 18:54:07.893419 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44\": container with ID starting with 204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44 not found: ID does not exist" containerID="204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.893449 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44"} err="failed to get container status \"204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44\": rpc error: code = NotFound desc = could not find container \"204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44\": container with ID starting with 204dac86f6555269ad63b2c9be8f2485d08e9ee580010b3cc961347c4d87fc44 not found: ID does not exist" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.928261 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-utilities\") pod \"a0ef18e4-d9a7-4122-89ed-b556ed419954\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.928383 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gld7g\" (UniqueName: 
\"kubernetes.io/projected/a0ef18e4-d9a7-4122-89ed-b556ed419954-kube-api-access-gld7g\") pod \"a0ef18e4-d9a7-4122-89ed-b556ed419954\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.928459 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-catalog-content\") pod \"a0ef18e4-d9a7-4122-89ed-b556ed419954\" (UID: \"a0ef18e4-d9a7-4122-89ed-b556ed419954\") " Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.929019 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-utilities" (OuterVolumeSpecName: "utilities") pod "a0ef18e4-d9a7-4122-89ed-b556ed419954" (UID: "a0ef18e4-d9a7-4122-89ed-b556ed419954"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.930151 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.932704 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ef18e4-d9a7-4122-89ed-b556ed419954-kube-api-access-gld7g" (OuterVolumeSpecName: "kube-api-access-gld7g") pod "a0ef18e4-d9a7-4122-89ed-b556ed419954" (UID: "a0ef18e4-d9a7-4122-89ed-b556ed419954"). InnerVolumeSpecName "kube-api-access-gld7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:54:07 crc kubenswrapper[4842]: I0311 18:54:07.980592 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0ef18e4-d9a7-4122-89ed-b556ed419954" (UID: "a0ef18e4-d9a7-4122-89ed-b556ed419954"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:54:08 crc kubenswrapper[4842]: I0311 18:54:08.031793 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gld7g\" (UniqueName: \"kubernetes.io/projected/a0ef18e4-d9a7-4122-89ed-b556ed419954-kube-api-access-gld7g\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:08 crc kubenswrapper[4842]: I0311 18:54:08.031833 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ef18e4-d9a7-4122-89ed-b556ed419954-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:08 crc kubenswrapper[4842]: I0311 18:54:08.163511 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d64lq"] Mar 11 18:54:08 crc kubenswrapper[4842]: I0311 18:54:08.165354 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d64lq"] Mar 11 18:54:08 crc kubenswrapper[4842]: I0311 18:54:08.678602 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hr6hd" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="registry-server" probeResult="failure" output=< Mar 11 18:54:08 crc kubenswrapper[4842]: timeout: failed to connect service ":50051" within 1s Mar 11 18:54:08 crc kubenswrapper[4842]: > Mar 11 18:54:08 crc kubenswrapper[4842]: I0311 18:54:08.969105 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" 
path="/var/lib/kubelet/pods/a0ef18e4-d9a7-4122-89ed-b556ed419954/volumes" Mar 11 18:54:14 crc kubenswrapper[4842]: I0311 18:54:14.590511 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:54:14 crc kubenswrapper[4842]: I0311 18:54:14.639879 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bslz"] Mar 11 18:54:14 crc kubenswrapper[4842]: I0311 18:54:14.880521 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7bslz" podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="registry-server" containerID="cri-o://e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003" gracePeriod=2 Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.438912 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.635693 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-utilities\") pod \"60214716-6377-46b4-9c9e-adc90ffca659\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.635788 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5gmt\" (UniqueName: \"kubernetes.io/projected/60214716-6377-46b4-9c9e-adc90ffca659-kube-api-access-d5gmt\") pod \"60214716-6377-46b4-9c9e-adc90ffca659\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.635826 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-catalog-content\") pod 
\"60214716-6377-46b4-9c9e-adc90ffca659\" (UID: \"60214716-6377-46b4-9c9e-adc90ffca659\") " Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.636396 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-utilities" (OuterVolumeSpecName: "utilities") pod "60214716-6377-46b4-9c9e-adc90ffca659" (UID: "60214716-6377-46b4-9c9e-adc90ffca659"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.636661 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.642458 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60214716-6377-46b4-9c9e-adc90ffca659-kube-api-access-d5gmt" (OuterVolumeSpecName: "kube-api-access-d5gmt") pod "60214716-6377-46b4-9c9e-adc90ffca659" (UID: "60214716-6377-46b4-9c9e-adc90ffca659"). InnerVolumeSpecName "kube-api-access-d5gmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.690402 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60214716-6377-46b4-9c9e-adc90ffca659" (UID: "60214716-6377-46b4-9c9e-adc90ffca659"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.737931 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5gmt\" (UniqueName: \"kubernetes.io/projected/60214716-6377-46b4-9c9e-adc90ffca659-kube-api-access-d5gmt\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.737990 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60214716-6377-46b4-9c9e-adc90ffca659-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.888886 4842 generic.go:334] "Generic (PLEG): container finished" podID="60214716-6377-46b4-9c9e-adc90ffca659" containerID="e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003" exitCode=0 Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.888964 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bslz" event={"ID":"60214716-6377-46b4-9c9e-adc90ffca659","Type":"ContainerDied","Data":"e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003"} Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.888995 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7bslz" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.889031 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bslz" event={"ID":"60214716-6377-46b4-9c9e-adc90ffca659","Type":"ContainerDied","Data":"7ff95503cf1d4dc05f4e88b1c37fb010aea1c91f62e5a86745dbae366afca348"} Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.889072 4842 scope.go:117] "RemoveContainer" containerID="e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.923459 4842 scope.go:117] "RemoveContainer" containerID="8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.931320 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bslz"] Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.936826 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7bslz"] Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.956343 4842 scope.go:117] "RemoveContainer" containerID="386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.976124 4842 scope.go:117] "RemoveContainer" containerID="e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003" Mar 11 18:54:15 crc kubenswrapper[4842]: E0311 18:54:15.976965 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003\": container with ID starting with e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003 not found: ID does not exist" containerID="e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.977026 4842 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003"} err="failed to get container status \"e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003\": rpc error: code = NotFound desc = could not find container \"e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003\": container with ID starting with e42a906fc859a13803b0eb1e6d807c818d1baaa66504847f97aa85b59ea88003 not found: ID does not exist" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.977061 4842 scope.go:117] "RemoveContainer" containerID="8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb" Mar 11 18:54:15 crc kubenswrapper[4842]: E0311 18:54:15.977742 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb\": container with ID starting with 8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb not found: ID does not exist" containerID="8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.977783 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb"} err="failed to get container status \"8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb\": rpc error: code = NotFound desc = could not find container \"8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb\": container with ID starting with 8ff7b0ae9880875fc55031239e044e60354bc171f5e2263c610dc4f9c745dedb not found: ID does not exist" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.977824 4842 scope.go:117] "RemoveContainer" containerID="386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431" Mar 11 18:54:15 crc kubenswrapper[4842]: E0311 
18:54:15.978330 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431\": container with ID starting with 386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431 not found: ID does not exist" containerID="386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431" Mar 11 18:54:15 crc kubenswrapper[4842]: I0311 18:54:15.978374 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431"} err="failed to get container status \"386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431\": rpc error: code = NotFound desc = could not find container \"386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431\": container with ID starting with 386a99c5dbea566d5ff24b6ab17da7676df7289b8ddbf811d9fe7873891d3431 not found: ID does not exist" Mar 11 18:54:16 crc kubenswrapper[4842]: I0311 18:54:16.968896 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60214716-6377-46b4-9c9e-adc90ffca659" path="/var/lib/kubelet/pods/60214716-6377-46b4-9c9e-adc90ffca659/volumes" Mar 11 18:54:17 crc kubenswrapper[4842]: I0311 18:54:17.326154 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2flrr"] Mar 11 18:54:17 crc kubenswrapper[4842]: I0311 18:54:17.672407 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:54:17 crc kubenswrapper[4842]: I0311 18:54:17.715404 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.144871 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-945c7c594-g7j67"] Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.147333 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" podUID="a6bba7c5-39f3-4c22-9561-688c47d0fba9" containerName="controller-manager" containerID="cri-o://c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e" gracePeriod=30 Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.245972 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"] Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.246167 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" podUID="693bba87-d346-4a84-8289-274e437065d0" containerName="route-controller-manager" containerID="cri-o://cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6" gracePeriod=30 Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.742382 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.773945 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785310 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-client-ca\") pod \"693bba87-d346-4a84-8289-274e437065d0\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785389 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm7mj\" (UniqueName: \"kubernetes.io/projected/a6bba7c5-39f3-4c22-9561-688c47d0fba9-kube-api-access-wm7mj\") pod \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785571 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45sgl\" (UniqueName: \"kubernetes.io/projected/693bba87-d346-4a84-8289-274e437065d0-kube-api-access-45sgl\") pod \"693bba87-d346-4a84-8289-274e437065d0\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785639 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-client-ca\") pod \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785688 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-config\") pod \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785760 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6bba7c5-39f3-4c22-9561-688c47d0fba9-serving-cert\") pod \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785790 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693bba87-d346-4a84-8289-274e437065d0-serving-cert\") pod \"693bba87-d346-4a84-8289-274e437065d0\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785818 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-proxy-ca-bundles\") pod \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\" (UID: \"a6bba7c5-39f3-4c22-9561-688c47d0fba9\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.785856 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-config\") pod \"693bba87-d346-4a84-8289-274e437065d0\" (UID: \"693bba87-d346-4a84-8289-274e437065d0\") " Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.786660 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "693bba87-d346-4a84-8289-274e437065d0" (UID: "693bba87-d346-4a84-8289-274e437065d0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.786731 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-config" (OuterVolumeSpecName: "config") pod "693bba87-d346-4a84-8289-274e437065d0" (UID: "693bba87-d346-4a84-8289-274e437065d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.787416 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-config" (OuterVolumeSpecName: "config") pod "a6bba7c5-39f3-4c22-9561-688c47d0fba9" (UID: "a6bba7c5-39f3-4c22-9561-688c47d0fba9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.789664 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a6bba7c5-39f3-4c22-9561-688c47d0fba9" (UID: "a6bba7c5-39f3-4c22-9561-688c47d0fba9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.789922 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-client-ca" (OuterVolumeSpecName: "client-ca") pod "a6bba7c5-39f3-4c22-9561-688c47d0fba9" (UID: "a6bba7c5-39f3-4c22-9561-688c47d0fba9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.794988 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bba7c5-39f3-4c22-9561-688c47d0fba9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a6bba7c5-39f3-4c22-9561-688c47d0fba9" (UID: "a6bba7c5-39f3-4c22-9561-688c47d0fba9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.795092 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693bba87-d346-4a84-8289-274e437065d0-kube-api-access-45sgl" (OuterVolumeSpecName: "kube-api-access-45sgl") pod "693bba87-d346-4a84-8289-274e437065d0" (UID: "693bba87-d346-4a84-8289-274e437065d0"). InnerVolumeSpecName "kube-api-access-45sgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.795334 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bba7c5-39f3-4c22-9561-688c47d0fba9-kube-api-access-wm7mj" (OuterVolumeSpecName: "kube-api-access-wm7mj") pod "a6bba7c5-39f3-4c22-9561-688c47d0fba9" (UID: "a6bba7c5-39f3-4c22-9561-688c47d0fba9"). InnerVolumeSpecName "kube-api-access-wm7mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.795664 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/693bba87-d346-4a84-8289-274e437065d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "693bba87-d346-4a84-8289-274e437065d0" (UID: "693bba87-d346-4a84-8289-274e437065d0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887552 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6bba7c5-39f3-4c22-9561-688c47d0fba9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887587 4842 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/693bba87-d346-4a84-8289-274e437065d0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887596 4842 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887610 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887620 4842 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/693bba87-d346-4a84-8289-274e437065d0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887629 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm7mj\" (UniqueName: \"kubernetes.io/projected/a6bba7c5-39f3-4c22-9561-688c47d0fba9-kube-api-access-wm7mj\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887638 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45sgl\" (UniqueName: \"kubernetes.io/projected/693bba87-d346-4a84-8289-274e437065d0-kube-api-access-45sgl\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887647 4842 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.887658 4842 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6bba7c5-39f3-4c22-9561-688c47d0fba9-config\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.911173 4842 generic.go:334] "Generic (PLEG): container finished" podID="a6bba7c5-39f3-4c22-9561-688c47d0fba9" containerID="c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e" exitCode=0 Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.911241 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.911315 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" event={"ID":"a6bba7c5-39f3-4c22-9561-688c47d0fba9","Type":"ContainerDied","Data":"c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e"} Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.911398 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945c7c594-g7j67" event={"ID":"a6bba7c5-39f3-4c22-9561-688c47d0fba9","Type":"ContainerDied","Data":"c82591a9ee489944934a37f7f1c3d84ff3daf251ecab4f77efa06747d5fbe1e4"} Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.911430 4842 scope.go:117] "RemoveContainer" containerID="c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.915115 4842 generic.go:334] "Generic (PLEG): container finished" podID="693bba87-d346-4a84-8289-274e437065d0" 
containerID="cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6" exitCode=0 Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.915175 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" event={"ID":"693bba87-d346-4a84-8289-274e437065d0","Type":"ContainerDied","Data":"cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6"} Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.915220 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" event={"ID":"693bba87-d346-4a84-8289-274e437065d0","Type":"ContainerDied","Data":"7e2621c59a3301d1a6379326227b843d5fb44524487f3edf1d7bab1865b2d74f"} Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.915365 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.928828 4842 scope.go:117] "RemoveContainer" containerID="c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e" Mar 11 18:54:18 crc kubenswrapper[4842]: E0311 18:54:18.929455 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e\": container with ID starting with c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e not found: ID does not exist" containerID="c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.929528 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e"} err="failed to get container status \"c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e\": 
rpc error: code = NotFound desc = could not find container \"c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e\": container with ID starting with c4c023e4d7187949a9161ae14788848a7c6e4af5dcc9945a4456c128290a316e not found: ID does not exist" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.929591 4842 scope.go:117] "RemoveContainer" containerID="cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.943879 4842 scope.go:117] "RemoveContainer" containerID="cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6" Mar 11 18:54:18 crc kubenswrapper[4842]: E0311 18:54:18.944359 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6\": container with ID starting with cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6 not found: ID does not exist" containerID="cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.944414 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6"} err="failed to get container status \"cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6\": rpc error: code = NotFound desc = could not find container \"cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6\": container with ID starting with cc64785303c8eb32ce52e30659bfaadc76be5a8da4490a434449a3d2ec8c6cd6 not found: ID does not exist" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.945255 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-945c7c594-g7j67"] Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.949609 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-945c7c594-g7j67"] Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.968219 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6bba7c5-39f3-4c22-9561-688c47d0fba9" path="/var/lib/kubelet/pods/a6bba7c5-39f3-4c22-9561-688c47d0fba9/volumes" Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.968626 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"] Mar 11 18:54:18 crc kubenswrapper[4842]: I0311 18:54:18.970399 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9865db89-k4tf8"] Mar 11 18:54:19 crc kubenswrapper[4842]: I0311 18:54:19.990965 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hr6hd"] Mar 11 18:54:19 crc kubenswrapper[4842]: I0311 18:54:19.992603 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hr6hd" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="registry-server" containerID="cri-o://317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c" gracePeriod=2 Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.201422 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh"] Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.201857 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="registry-server" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.201882 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="registry-server" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.201897 4842 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="registry-server" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.201904 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="registry-server" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.201915 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="extract-utilities" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.201924 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="extract-utilities" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.201939 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bba7c5-39f3-4c22-9561-688c47d0fba9" containerName="controller-manager" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.201946 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bba7c5-39f3-4c22-9561-688c47d0fba9" containerName="controller-manager" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.201959 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="extract-content" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.201966 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="extract-content" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.201980 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="extract-content" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.201987 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="extract-content" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.201996 4842 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="extract-utilities" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202002 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="extract-utilities" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.202016 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842d8359-baaa-48cc-b80f-28a6e0045e8b" containerName="oc" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202022 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d8359-baaa-48cc-b80f-28a6e0045e8b" containerName="oc" Mar 11 18:54:20 crc kubenswrapper[4842]: E0311 18:54:20.202035 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693bba87-d346-4a84-8289-274e437065d0" containerName="route-controller-manager" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202041 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="693bba87-d346-4a84-8289-274e437065d0" containerName="route-controller-manager" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202193 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bba7c5-39f3-4c22-9561-688c47d0fba9" containerName="controller-manager" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202205 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="693bba87-d346-4a84-8289-274e437065d0" containerName="route-controller-manager" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202217 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="842d8359-baaa-48cc-b80f-28a6e0045e8b" containerName="oc" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202234 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="60214716-6377-46b4-9c9e-adc90ffca659" containerName="registry-server" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202241 4842 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a0ef18e4-d9a7-4122-89ed-b556ed419954" containerName="registry-server" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.202846 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.208996 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq"] Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.209985 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.211988 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.212285 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.212452 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.215677 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.215863 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.216776 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.217031 4842 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.217681 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.218217 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.218567 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.218709 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.218819 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.219588 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.224535 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq"] Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.253839 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh"] Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.398193 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.406932 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7a0526-e2e6-460f-b069-7036446eca16-config\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.406990 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-client-ca\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.407015 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0518472-01fd-4a87-a315-2fb14330907b-serving-cert\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.407050 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-proxy-ca-bundles\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.407669 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7a0526-e2e6-460f-b069-7036446eca16-client-ca\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.407791 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b429h\" (UniqueName: \"kubernetes.io/projected/b0518472-01fd-4a87-a315-2fb14330907b-kube-api-access-b429h\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.407876 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48cxh\" (UniqueName: \"kubernetes.io/projected/6d7a0526-e2e6-460f-b069-7036446eca16-kube-api-access-48cxh\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.408004 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7a0526-e2e6-460f-b069-7036446eca16-serving-cert\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.408091 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-config\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: 
\"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.508618 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-catalog-content\") pod \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.508729 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-utilities\") pod \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.508763 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkkm\" (UniqueName: \"kubernetes.io/projected/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-kube-api-access-ckkkm\") pod \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\" (UID: \"f46a8d85-7384-4cdc-a19d-92a477bcc7d6\") " Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.508930 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7a0526-e2e6-460f-b069-7036446eca16-client-ca\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.508962 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b429h\" (UniqueName: \"kubernetes.io/projected/b0518472-01fd-4a87-a315-2fb14330907b-kube-api-access-b429h\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " 
pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.508988 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48cxh\" (UniqueName: \"kubernetes.io/projected/6d7a0526-e2e6-460f-b069-7036446eca16-kube-api-access-48cxh\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.509019 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7a0526-e2e6-460f-b069-7036446eca16-serving-cert\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.509046 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-config\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.509071 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7a0526-e2e6-460f-b069-7036446eca16-config\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.509091 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-client-ca\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.509112 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0518472-01fd-4a87-a315-2fb14330907b-serving-cert\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.509142 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-proxy-ca-bundles\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.509688 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-utilities" (OuterVolumeSpecName: "utilities") pod "f46a8d85-7384-4cdc-a19d-92a477bcc7d6" (UID: "f46a8d85-7384-4cdc-a19d-92a477bcc7d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.510436 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-proxy-ca-bundles\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.510728 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-client-ca\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.511855 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7a0526-e2e6-460f-b069-7036446eca16-config\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.511997 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0518472-01fd-4a87-a315-2fb14330907b-config\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.512895 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d7a0526-e2e6-460f-b069-7036446eca16-client-ca\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: 
\"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.516984 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-kube-api-access-ckkkm" (OuterVolumeSpecName: "kube-api-access-ckkkm") pod "f46a8d85-7384-4cdc-a19d-92a477bcc7d6" (UID: "f46a8d85-7384-4cdc-a19d-92a477bcc7d6"). InnerVolumeSpecName "kube-api-access-ckkkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.525389 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0518472-01fd-4a87-a315-2fb14330907b-serving-cert\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.525837 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d7a0526-e2e6-460f-b069-7036446eca16-serving-cert\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.535177 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48cxh\" (UniqueName: \"kubernetes.io/projected/6d7a0526-e2e6-460f-b069-7036446eca16-kube-api-access-48cxh\") pod \"route-controller-manager-7496cd9dd6-nn8wq\" (UID: \"6d7a0526-e2e6-460f-b069-7036446eca16\") " pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.537155 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-b429h\" (UniqueName: \"kubernetes.io/projected/b0518472-01fd-4a87-a315-2fb14330907b-kube-api-access-b429h\") pod \"controller-manager-6f8f5cb4b7-872jh\" (UID: \"b0518472-01fd-4a87-a315-2fb14330907b\") " pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.559834 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.569935 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.615059 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.615126 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkkm\" (UniqueName: \"kubernetes.io/projected/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-kube-api-access-ckkkm\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.642857 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f46a8d85-7384-4cdc-a19d-92a477bcc7d6" (UID: "f46a8d85-7384-4cdc-a19d-92a477bcc7d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.718229 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f46a8d85-7384-4cdc-a19d-92a477bcc7d6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.797242 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq"] Mar 11 18:54:20 crc kubenswrapper[4842]: W0311 18:54:20.807355 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d7a0526_e2e6_460f_b069_7036446eca16.slice/crio-917875c7120754d35f080be75f59c3d83418fe76c3389174b9fe26266d1afe2c WatchSource:0}: Error finding container 917875c7120754d35f080be75f59c3d83418fe76c3389174b9fe26266d1afe2c: Status 404 returned error can't find the container with id 917875c7120754d35f080be75f59c3d83418fe76c3389174b9fe26266d1afe2c Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.850893 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh"] Mar 11 18:54:20 crc kubenswrapper[4842]: W0311 18:54:20.858144 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0518472_01fd_4a87_a315_2fb14330907b.slice/crio-d53025e77eb1b99b9393bf743548de8cf8fa8df944a9f81adc336e241092a97e WatchSource:0}: Error finding container d53025e77eb1b99b9393bf743548de8cf8fa8df944a9f81adc336e241092a97e: Status 404 returned error can't find the container with id d53025e77eb1b99b9393bf743548de8cf8fa8df944a9f81adc336e241092a97e Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.933022 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" 
event={"ID":"6d7a0526-e2e6-460f-b069-7036446eca16","Type":"ContainerStarted","Data":"917875c7120754d35f080be75f59c3d83418fe76c3389174b9fe26266d1afe2c"} Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.935923 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" event={"ID":"b0518472-01fd-4a87-a315-2fb14330907b","Type":"ContainerStarted","Data":"d53025e77eb1b99b9393bf743548de8cf8fa8df944a9f81adc336e241092a97e"} Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.939825 4842 generic.go:334] "Generic (PLEG): container finished" podID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerID="317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c" exitCode=0 Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.939865 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hr6hd" event={"ID":"f46a8d85-7384-4cdc-a19d-92a477bcc7d6","Type":"ContainerDied","Data":"317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c"} Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.939887 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hr6hd" event={"ID":"f46a8d85-7384-4cdc-a19d-92a477bcc7d6","Type":"ContainerDied","Data":"dce966aa1a6948d41d28e0906ec625492e4b552c28a098723ddeb90f85e762be"} Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.939904 4842 scope.go:117] "RemoveContainer" containerID="317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.940020 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hr6hd" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.960785 4842 scope.go:117] "RemoveContainer" containerID="ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.970307 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693bba87-d346-4a84-8289-274e437065d0" path="/var/lib/kubelet/pods/693bba87-d346-4a84-8289-274e437065d0/volumes" Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.971415 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hr6hd"] Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.974192 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hr6hd"] Mar 11 18:54:20 crc kubenswrapper[4842]: I0311 18:54:20.987216 4842 scope.go:117] "RemoveContainer" containerID="43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.001116 4842 scope.go:117] "RemoveContainer" containerID="317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c" Mar 11 18:54:21 crc kubenswrapper[4842]: E0311 18:54:21.001876 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c\": container with ID starting with 317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c not found: ID does not exist" containerID="317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.001916 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c"} err="failed to get container status 
\"317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c\": rpc error: code = NotFound desc = could not find container \"317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c\": container with ID starting with 317c488c7e9dba0f510a0603cb2ab24ced1d4663ad41c356f1bb0288fef99b7c not found: ID does not exist" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.001948 4842 scope.go:117] "RemoveContainer" containerID="ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1" Mar 11 18:54:21 crc kubenswrapper[4842]: E0311 18:54:21.002418 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1\": container with ID starting with ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1 not found: ID does not exist" containerID="ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.002447 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1"} err="failed to get container status \"ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1\": rpc error: code = NotFound desc = could not find container \"ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1\": container with ID starting with ddbb68bea8608f0bca445db8429c50cb99f6de71da0b4736171f7c8ceb7400f1 not found: ID does not exist" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.002465 4842 scope.go:117] "RemoveContainer" containerID="43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133" Mar 11 18:54:21 crc kubenswrapper[4842]: E0311 18:54:21.002957 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133\": container with ID starting with 43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133 not found: ID does not exist" containerID="43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.002988 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133"} err="failed to get container status \"43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133\": rpc error: code = NotFound desc = could not find container \"43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133\": container with ID starting with 43e00a126a5c0b03ed2137a99a1f937ebfedaf393e35dfe8f34f179ba429f133 not found: ID does not exist" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.946227 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" event={"ID":"6d7a0526-e2e6-460f-b069-7036446eca16","Type":"ContainerStarted","Data":"73d2364f34ca72c27e31593fa2a6888ac94fe4ffb6f2ac2e0596c69d85358266"} Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.946679 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.947910 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" event={"ID":"b0518472-01fd-4a87-a315-2fb14330907b","Type":"ContainerStarted","Data":"f735efa133c51c67a063d96eb9a13ff7a62e338b6fcd117ca3f120858d2b93f6"} Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.948611 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:21 
crc kubenswrapper[4842]: I0311 18:54:21.953649 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.954856 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.968365 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7496cd9dd6-nn8wq" podStartSLOduration=3.968347219 podStartE2EDuration="3.968347219s" podCreationTimestamp="2026-03-11 18:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:54:21.964433979 +0000 UTC m=+307.612130269" watchObservedRunningTime="2026-03-11 18:54:21.968347219 +0000 UTC m=+307.616043499" Mar 11 18:54:21 crc kubenswrapper[4842]: I0311 18:54:21.982834 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f8f5cb4b7-872jh" podStartSLOduration=3.982814557 podStartE2EDuration="3.982814557s" podCreationTimestamp="2026-03-11 18:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:54:21.981240461 +0000 UTC m=+307.628936771" watchObservedRunningTime="2026-03-11 18:54:21.982814557 +0000 UTC m=+307.630510837" Mar 11 18:54:22 crc kubenswrapper[4842]: I0311 18:54:22.967033 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" path="/var/lib/kubelet/pods/f46a8d85-7384-4cdc-a19d-92a477bcc7d6/volumes" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.225457 4842 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.225925 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="extract-utilities" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.225937 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="extract-utilities" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.225954 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="extract-content" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.225960 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="extract-content" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.225971 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="registry-server" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.225976 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="registry-server" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226062 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46a8d85-7384-4cdc-a19d-92a477bcc7d6" containerName="registry-server" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226377 4842 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226528 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226679 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04" gracePeriod=15 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226720 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a" gracePeriod=15 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226720 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541" gracePeriod=15 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226759 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33" gracePeriod=15 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.226849 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301" gracePeriod=15 Mar 11 18:54:27 crc 
kubenswrapper[4842]: I0311 18:54:27.227336 4842 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227587 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227599 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227613 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227620 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227632 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227639 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227647 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227655 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227669 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc 
kubenswrapper[4842]: I0311 18:54:27.227676 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227688 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227695 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227703 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227711 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227726 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227734 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.227745 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.227752 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228199 4842 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228212 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228224 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228233 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228241 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228249 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228259 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228286 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.228437 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228732 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.228835 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.267371 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.294956 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.295001 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.295035 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.295097 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.295116 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.295132 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.295147 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.295171 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.395958 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396110 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396371 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396415 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396515 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396495 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 
18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396721 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396773 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396806 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396826 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396835 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396862 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396881 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396872 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396915 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.396972 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.564139 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:54:27 crc kubenswrapper[4842]: E0311 18:54:27.584004 4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.251:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189bde455064f962 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:54:27.583326562 +0000 UTC m=+313.231022842,LastTimestamp:2026-03-11 18:54:27.583326562 +0000 UTC m=+313.231022842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.983234 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1"} Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.983307 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f8a55f6051313db2c21024f6a9487f36fdb739687592903a9e3fb0787ce863da"} Mar 11 18:54:27 crc 
kubenswrapper[4842]: I0311 18:54:27.983927 4842 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.984282 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.984820 4842 generic.go:334] "Generic (PLEG): container finished" podID="f2484672-71b2-46e3-9ede-780d3e1aaafc" containerID="70b3e586c0b04c5256f011269af604504923cfcdb75443fb6e44ebbb04e68bb5" exitCode=0 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.984883 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f2484672-71b2-46e3-9ede-780d3e1aaafc","Type":"ContainerDied","Data":"70b3e586c0b04c5256f011269af604504923cfcdb75443fb6e44ebbb04e68bb5"} Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.985423 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.985662 4842 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.986053 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.987977 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.989697 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.990409 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33" exitCode=0 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.990431 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541" exitCode=0 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.990442 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a" exitCode=0 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.990451 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301" exitCode=2 Mar 11 18:54:27 crc kubenswrapper[4842]: I0311 18:54:27.990497 4842 scope.go:117] "RemoveContainer" containerID="22d329d32953b53484a66cf06ddded57040c67a0fe3ffcfe90d01ba240839c15" Mar 11 18:54:28 crc kubenswrapper[4842]: I0311 18:54:28.814467 4842 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 11 18:54:28 crc kubenswrapper[4842]: I0311 18:54:28.814801 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.003315 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.351416 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.352499 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.353082 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.424066 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2484672-71b2-46e3-9ede-780d3e1aaafc-kube-api-access\") pod \"f2484672-71b2-46e3-9ede-780d3e1aaafc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.424191 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-kubelet-dir\") pod \"f2484672-71b2-46e3-9ede-780d3e1aaafc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.424225 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-var-lock\") pod \"f2484672-71b2-46e3-9ede-780d3e1aaafc\" (UID: \"f2484672-71b2-46e3-9ede-780d3e1aaafc\") " Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.424796 4842 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-var-lock" (OuterVolumeSpecName: "var-lock") pod "f2484672-71b2-46e3-9ede-780d3e1aaafc" (UID: "f2484672-71b2-46e3-9ede-780d3e1aaafc"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.425137 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f2484672-71b2-46e3-9ede-780d3e1aaafc" (UID: "f2484672-71b2-46e3-9ede-780d3e1aaafc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.433567 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2484672-71b2-46e3-9ede-780d3e1aaafc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f2484672-71b2-46e3-9ede-780d3e1aaafc" (UID: "f2484672-71b2-46e3-9ede-780d3e1aaafc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.526089 4842 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-var-lock\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.526138 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2484672-71b2-46e3-9ede-780d3e1aaafc-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.526153 4842 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2484672-71b2-46e3-9ede-780d3e1aaafc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.587113 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.588467 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.589073 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.589597 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.590288 4842 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.627561 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.627745 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.627804 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.627856 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.627923 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.628021 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.628600 4842 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.628650 4842 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:29 crc kubenswrapper[4842]: I0311 18:54:29.628672 4842 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.014094 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.016657 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04" exitCode=0 Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.016757 4842 scope.go:117] "RemoveContainer" containerID="0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.016862 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.019420 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f2484672-71b2-46e3-9ede-780d3e1aaafc","Type":"ContainerDied","Data":"f292ee96cf61fc1cb5a75c6659fcaab1de9d9fbfa199ed3627e69b48e2eacf32"} Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.019498 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f292ee96cf61fc1cb5a75c6659fcaab1de9d9fbfa199ed3627e69b48e2eacf32" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.019561 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.034944 4842 scope.go:117] "RemoveContainer" containerID="72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.043151 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.044411 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.044803 4842 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.045056 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.045307 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.045500 4842 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.051252 4842 scope.go:117] "RemoveContainer" containerID="bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.068733 4842 scope.go:117] "RemoveContainer" containerID="d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.086172 4842 scope.go:117] "RemoveContainer" containerID="88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.102593 4842 scope.go:117] "RemoveContainer" 
containerID="8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.120799 4842 scope.go:117] "RemoveContainer" containerID="0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33" Mar 11 18:54:30 crc kubenswrapper[4842]: E0311 18:54:30.121328 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\": container with ID starting with 0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33 not found: ID does not exist" containerID="0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.121385 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33"} err="failed to get container status \"0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\": rpc error: code = NotFound desc = could not find container \"0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33\": container with ID starting with 0e1d302aa0d58c3d643f70fc6e13b996937b6d78e509c10b2b97cd8061f8ff33 not found: ID does not exist" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.121411 4842 scope.go:117] "RemoveContainer" containerID="72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541" Mar 11 18:54:30 crc kubenswrapper[4842]: E0311 18:54:30.121836 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\": container with ID starting with 72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541 not found: ID does not exist" containerID="72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541" Mar 11 18:54:30 crc 
kubenswrapper[4842]: I0311 18:54:30.121876 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541"} err="failed to get container status \"72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\": rpc error: code = NotFound desc = could not find container \"72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541\": container with ID starting with 72b488c2a99b4f31218084273fee209dbdfe61c77729b9f113143ed0a7586541 not found: ID does not exist" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.121918 4842 scope.go:117] "RemoveContainer" containerID="bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a" Mar 11 18:54:30 crc kubenswrapper[4842]: E0311 18:54:30.123512 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\": container with ID starting with bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a not found: ID does not exist" containerID="bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.124261 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a"} err="failed to get container status \"bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\": rpc error: code = NotFound desc = could not find container \"bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a\": container with ID starting with bcd13e5ced588b74559d32c70b2cb61e0a8e907dc8ff2f41c0362e49356d7c8a not found: ID does not exist" Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.124360 4842 scope.go:117] "RemoveContainer" containerID="d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301" Mar 11 
18:54:30 crc kubenswrapper[4842]: E0311 18:54:30.124828 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\": container with ID starting with d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301 not found: ID does not exist" containerID="d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301"
Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.124862 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301"} err="failed to get container status \"d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\": rpc error: code = NotFound desc = could not find container \"d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301\": container with ID starting with d5b978c61e9160eec748636699c90418380fd24308421d416ab48e4e8a5d6301 not found: ID does not exist"
Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.124877 4842 scope.go:117] "RemoveContainer" containerID="88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04"
Mar 11 18:54:30 crc kubenswrapper[4842]: E0311 18:54:30.125240 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\": container with ID starting with 88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04 not found: ID does not exist" containerID="88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04"
Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.125284 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04"} err="failed to get container status \"88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\": rpc error: code = NotFound desc = could not find container \"88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04\": container with ID starting with 88f6968a9169f536022505502a30d7a2c05a56306af51c9b041b852b83fc7a04 not found: ID does not exist"
Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.125299 4842 scope.go:117] "RemoveContainer" containerID="8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be"
Mar 11 18:54:30 crc kubenswrapper[4842]: E0311 18:54:30.125560 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\": container with ID starting with 8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be not found: ID does not exist" containerID="8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be"
Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.125593 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be"} err="failed to get container status \"8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\": rpc error: code = NotFound desc = could not find container \"8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be\": container with ID starting with 8d81d291e4647c7ec9075e54625e8234d75e77d812da492666cf306ce9ca29be not found: ID does not exist"
Mar 11 18:54:30 crc kubenswrapper[4842]: I0311 18:54:30.970575 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 11 18:54:31 crc kubenswrapper[4842]: E0311 18:54:31.923984 4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.251:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189bde455064f962 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:54:27.583326562 +0000 UTC m=+313.231022842,LastTimestamp:2026-03-11 18:54:27.583326562 +0000 UTC m=+313.231022842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:54:34 crc kubenswrapper[4842]: I0311 18:54:34.967883 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:34 crc kubenswrapper[4842]: I0311 18:54:34.969033 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:36 crc kubenswrapper[4842]: E0311 18:54:36.458993 4842 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:36 crc kubenswrapper[4842]: E0311 18:54:36.460258 4842 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:36 crc kubenswrapper[4842]: E0311 18:54:36.460635 4842 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:36 crc kubenswrapper[4842]: E0311 18:54:36.460939 4842 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:36 crc kubenswrapper[4842]: E0311 18:54:36.461317 4842 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:36 crc kubenswrapper[4842]: I0311 18:54:36.461349 4842 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 11 18:54:36 crc kubenswrapper[4842]: E0311 18:54:36.461669 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="200ms"
Mar 11 18:54:36 crc kubenswrapper[4842]: E0311 18:54:36.663906 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="400ms"
Mar 11 18:54:37 crc kubenswrapper[4842]: E0311 18:54:37.065128 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="800ms"
Mar 11 18:54:37 crc kubenswrapper[4842]: E0311 18:54:37.866385 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="1.6s"
Mar 11 18:54:39 crc kubenswrapper[4842]: E0311 18:54:39.467862 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.251:6443: connect: connection refused" interval="3.2s"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.112809 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.115136 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.115214 4842 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd" exitCode=1
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.115262 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd"}
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.116118 4842 scope.go:117] "RemoveContainer" containerID="f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.116528 4842 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.117221 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.117721 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.962263 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.964164 4842 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.964951 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.965938 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.977897 4842 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.977932 4842 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c"
Mar 11 18:54:40 crc kubenswrapper[4842]: E0311 18:54:40.978455 4842 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:54:40 crc kubenswrapper[4842]: I0311 18:54:40.978990 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:54:41 crc kubenswrapper[4842]: W0311 18:54:41.006655 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e8c97e9416822baa0f337e87202652cd2638f039ff39dc95be789bf6e39704b2 WatchSource:0}: Error finding container e8c97e9416822baa0f337e87202652cd2638f039ff39dc95be789bf6e39704b2: Status 404 returned error can't find the container with id e8c97e9416822baa0f337e87202652cd2638f039ff39dc95be789bf6e39704b2
Mar 11 18:54:41 crc kubenswrapper[4842]: I0311 18:54:41.124737 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 11 18:54:41 crc kubenswrapper[4842]: I0311 18:54:41.126174 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 11 18:54:41 crc kubenswrapper[4842]: I0311 18:54:41.126262 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"22c1f797663260326b0026e9db3131e95c7e2adfca3641e41ac9ec05f666b8a7"}
Mar 11 18:54:41 crc kubenswrapper[4842]: I0311 18:54:41.127081 4842 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:41 crc kubenswrapper[4842]: I0311 18:54:41.127512 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:41 crc kubenswrapper[4842]: I0311 18:54:41.127804 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:41 crc kubenswrapper[4842]: I0311 18:54:41.128357 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e8c97e9416822baa0f337e87202652cd2638f039ff39dc95be789bf6e39704b2"}
Mar 11 18:54:41 crc kubenswrapper[4842]: E0311 18:54:41.926507 4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.251:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189bde455064f962 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 18:54:27.583326562 +0000 UTC m=+313.231022842,LastTimestamp:2026-03-11 18:54:27.583326562 +0000 UTC m=+313.231022842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.138544 4842 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="827fe13aec0c4b483ac9c224f05306f580eb4d2fe6780d0be5aa9778e98b978b" exitCode=0
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.138682 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"827fe13aec0c4b483ac9c224f05306f580eb4d2fe6780d0be5aa9778e98b978b"}
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.139086 4842 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.139130 4842 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.139718 4842 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:42 crc kubenswrapper[4842]: E0311 18:54:42.139905 4842 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.251:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.140149 4842 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.140649 4842 status_manager.go:851] "Failed to get status for pod" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.251:6443: connect: connection refused"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.349945 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" podUID="148bd39e-58ee-4a7f-aa9c-8435ab50d862" containerName="oauth-openshift" containerID="cri-o://3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2" gracePeriod=15
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.785324 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.785930 4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.785989 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 11 18:54:42 crc kubenswrapper[4842]: I0311 18:54:42.929661 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr"
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067677 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-provider-selection\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067770 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-trusted-ca-bundle\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067825 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-error\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067846 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58rjr\" (UniqueName: \"kubernetes.io/projected/148bd39e-58ee-4a7f-aa9c-8435ab50d862-kube-api-access-58rjr\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067864 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-service-ca\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067897 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-cliconfig\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067915 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-serving-cert\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067935 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-policies\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.067951 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-dir\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.068114 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.068720 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-idp-0-file-data\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.068766 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-login\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.068790 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-router-certs\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.068939 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.069022 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-ocp-branding-template\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.069096 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-session\") pod \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\" (UID: \"148bd39e-58ee-4a7f-aa9c-8435ab50d862\") "
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.069757 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.069825 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.069845 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.070401 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.070416 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.070426 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.070436 4842 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.070446 4842 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/148bd39e-58ee-4a7f-aa9c-8435ab50d862-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.075093 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.076796 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.077302 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.081411 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.081506 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148bd39e-58ee-4a7f-aa9c-8435ab50d862-kube-api-access-58rjr" (OuterVolumeSpecName: "kube-api-access-58rjr") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "kube-api-access-58rjr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.081707 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.081914 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.082218 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.082466 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "148bd39e-58ee-4a7f-aa9c-8435ab50d862" (UID: "148bd39e-58ee-4a7f-aa9c-8435ab50d862"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.146152 4842 generic.go:334] "Generic (PLEG): container finished" podID="148bd39e-58ee-4a7f-aa9c-8435ab50d862" containerID="3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2" exitCode=0
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.146314 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" event={"ID":"148bd39e-58ee-4a7f-aa9c-8435ab50d862","Type":"ContainerDied","Data":"3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2"}
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.146340 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr" event={"ID":"148bd39e-58ee-4a7f-aa9c-8435ab50d862","Type":"ContainerDied","Data":"158c32959cf9c93533c19c03164fc807e428b1454af9e711a0588b231b7ed290"}
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.146356 4842 scope.go:117] "RemoveContainer" containerID="3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2"
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.146438 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2flrr"
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.156637 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1ae820b2eb57feeefd4f7bb42f70d32392d030fea5be7a359f29ed48ff33b8b6"}
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.156686 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"65ae505d542410c6e4503df0c0e7f546eb241181b744d7b674edc90d920f715d"}
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.156699 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"566585b9dd488c7e25e9156f3d81b14bc6231b5794e7674deb090fd868f6c499"}
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.171916 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.171952 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.171963 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.171975 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58rjr\" (UniqueName: \"kubernetes.io/projected/148bd39e-58ee-4a7f-aa9c-8435ab50d862-kube-api-access-58rjr\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.171984 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.171992 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.172004 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.172015 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.172035 4842 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/148bd39e-58ee-4a7f-aa9c-8435ab50d862-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.177840 4842 scope.go:117] "RemoveContainer" containerID="3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2"
Mar 11 18:54:43 crc kubenswrapper[4842]: E0311 18:54:43.178655 4842
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2\": container with ID starting with 3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2 not found: ID does not exist" containerID="3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2" Mar 11 18:54:43 crc kubenswrapper[4842]: I0311 18:54:43.178686 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2"} err="failed to get container status \"3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2\": rpc error: code = NotFound desc = could not find container \"3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2\": container with ID starting with 3676ce72f0a701f8ab094941c223b77926b60067da88452f4fa457d10f28ece2 not found: ID does not exist" Mar 11 18:54:44 crc kubenswrapper[4842]: I0311 18:54:44.168104 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fdb57b062f7b4e940ef53718ea5b559d6b0954b28f0850462d1dc3e247649dfe"} Mar 11 18:54:44 crc kubenswrapper[4842]: I0311 18:54:44.168511 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"526d9946ed1574e64b2449b134b40821c3de14792ef817cf7473745de8de8478"} Mar 11 18:54:44 crc kubenswrapper[4842]: I0311 18:54:44.168533 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:44 crc kubenswrapper[4842]: I0311 18:54:44.168669 4842 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c" Mar 11 18:54:44 crc kubenswrapper[4842]: I0311 18:54:44.168702 4842 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c" Mar 11 18:54:45 crc kubenswrapper[4842]: I0311 18:54:45.980059 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:45 crc kubenswrapper[4842]: I0311 18:54:45.980102 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:45 crc kubenswrapper[4842]: I0311 18:54:45.987802 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:46 crc kubenswrapper[4842]: I0311 18:54:46.733662 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:54:49 crc kubenswrapper[4842]: I0311 18:54:49.178949 4842 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:49 crc kubenswrapper[4842]: I0311 18:54:49.203777 4842 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c" Mar 11 18:54:49 crc kubenswrapper[4842]: I0311 18:54:49.204074 4842 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c" Mar 11 18:54:49 crc kubenswrapper[4842]: I0311 18:54:49.208377 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 18:54:49 crc kubenswrapper[4842]: I0311 18:54:49.210692 4842 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a8417f20-01d1-4795-b3c9-845bb86ac15d" Mar 11 18:54:50 crc kubenswrapper[4842]: I0311 18:54:50.209191 4842 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c" Mar 11 18:54:50 crc kubenswrapper[4842]: I0311 18:54:50.209611 4842 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c" Mar 11 18:54:52 crc kubenswrapper[4842]: I0311 18:54:52.785854 4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 11 18:54:52 crc kubenswrapper[4842]: I0311 18:54:52.786150 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.108668 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.108739 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.110651 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.121964 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.139502 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.139537 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.182043 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:54:53 crc kubenswrapper[4842]: I0311 18:54:53.194654 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 18:54:54 crc kubenswrapper[4842]: I0311 18:54:54.237964 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f04c828f9096ada5ce353397bf8595fd8749e997b496d023ec35dacea21dc49a"} Mar 11 18:54:54 crc kubenswrapper[4842]: I0311 18:54:54.238348 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d2182370964a7ecfa4255a0ab80003845f07d5308cfd572d1276aebcff7bb85f"} Mar 11 18:54:54 crc kubenswrapper[4842]: I0311 18:54:54.239844 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"01109bacdc1ffb9b43eaaa3ff050c3008f3fb1364808d530e52a875e8714f8f4"} Mar 11 18:54:54 crc kubenswrapper[4842]: I0311 18:54:54.239881 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ca12dad7d6046a6474a5da78d55c25b3888047110ed13b74d7199d0497d55176"} Mar 11 18:54:54 crc kubenswrapper[4842]: I0311 18:54:54.240366 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 18:54:54 crc kubenswrapper[4842]: I0311 18:54:54.999648 4842 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a8417f20-01d1-4795-b3c9-845bb86ac15d" Mar 11 18:54:55 crc kubenswrapper[4842]: I0311 
18:54:55.246708 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 11 18:54:55 crc kubenswrapper[4842]: I0311 18:54:55.246768 4842 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="f04c828f9096ada5ce353397bf8595fd8749e997b496d023ec35dacea21dc49a" exitCode=255 Mar 11 18:54:55 crc kubenswrapper[4842]: I0311 18:54:55.247065 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"f04c828f9096ada5ce353397bf8595fd8749e997b496d023ec35dacea21dc49a"} Mar 11 18:54:55 crc kubenswrapper[4842]: I0311 18:54:55.247314 4842 scope.go:117] "RemoveContainer" containerID="f04c828f9096ada5ce353397bf8595fd8749e997b496d023ec35dacea21dc49a" Mar 11 18:54:56 crc kubenswrapper[4842]: I0311 18:54:56.257292 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 18:54:56 crc kubenswrapper[4842]: I0311 18:54:56.258346 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 11 18:54:56 crc kubenswrapper[4842]: I0311 18:54:56.258456 4842 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="c40f2428efe5fc97a23c24626bde2dbdb8426b5d6c960966c7c12c4a41195cbc" exitCode=255 Mar 11 18:54:56 crc kubenswrapper[4842]: I0311 18:54:56.258545 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"c40f2428efe5fc97a23c24626bde2dbdb8426b5d6c960966c7c12c4a41195cbc"} Mar 11 18:54:56 crc kubenswrapper[4842]: I0311 18:54:56.258628 4842 scope.go:117] "RemoveContainer" containerID="f04c828f9096ada5ce353397bf8595fd8749e997b496d023ec35dacea21dc49a" Mar 11 18:54:56 crc kubenswrapper[4842]: I0311 18:54:56.259240 4842 scope.go:117] "RemoveContainer" containerID="c40f2428efe5fc97a23c24626bde2dbdb8426b5d6c960966c7c12c4a41195cbc" Mar 11 18:54:56 crc kubenswrapper[4842]: E0311 18:54:56.259638 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:54:57 crc kubenswrapper[4842]: I0311 18:54:57.264954 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 18:54:59 crc kubenswrapper[4842]: I0311 18:54:59.225196 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 18:54:59 crc kubenswrapper[4842]: I0311 18:54:59.714121 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 11 18:54:59 crc kubenswrapper[4842]: I0311 18:54:59.729823 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 18:54:59 crc kubenswrapper[4842]: I0311 18:54:59.754721 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 11 
18:54:59 crc kubenswrapper[4842]: I0311 18:54:59.899067 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 18:54:59 crc kubenswrapper[4842]: I0311 18:54:59.945974 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.137556 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.182449 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.183553 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.243765 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.368889 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.480740 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.526113 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 11 18:55:00 crc kubenswrapper[4842]: I0311 18:55:00.734133 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.113165 4842 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.140804 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.195785 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.268989 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.411146 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.460190 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.462840 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.630719 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.674461 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.738122 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.739191 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.895842 4842 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 11 18:55:01 crc kubenswrapper[4842]: I0311 18:55:01.962537 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.115399 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.203079 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.242718 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.449366 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.556669 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.565693 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.674691 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.691455 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.785938 4842 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.786025 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.786097 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.786962 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"22c1f797663260326b0026e9db3131e95c7e2adfca3641e41ac9ec05f666b8a7"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.787149 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://22c1f797663260326b0026e9db3131e95c7e2adfca3641e41ac9ec05f666b8a7" gracePeriod=30 Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.812943 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.841130 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.867038 4842 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 18:55:02 crc kubenswrapper[4842]: I0311 18:55:02.984016 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.008963 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.079487 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.089762 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.118982 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.155665 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.200912 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.224256 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.241572 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.243197 4842 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.366336 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.372019 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.402167 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.408318 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.475756 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.507926 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.527914 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.556686 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.612439 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.646777 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 18:55:03 crc 
kubenswrapper[4842]: I0311 18:55:03.681385 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.820139 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.856564 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 11 18:55:03 crc kubenswrapper[4842]: I0311 18:55:03.994868 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.093493 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.102108 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.225135 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.225809 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.288182 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.350126 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.367011 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.401476 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.490151 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.666535 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.724880 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.793019 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.793235 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.838678 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 11 18:55:04 crc kubenswrapper[4842]: I0311 18:55:04.980501 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.042985 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.197584 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.256989 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.285799 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.426113 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.543197 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.567415 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.590221 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.617763 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.623797 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.689164 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.775840 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.844967 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 11 18:55:05 crc kubenswrapper[4842]: I0311 18:55:05.958010 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.060202 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.089559 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.221300 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.249478 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.508537 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.592379 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.612975 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.624502 4842 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.683040 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.726420 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.819085 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.881520 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.896028 4842 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 11 18:55:06 crc kubenswrapper[4842]: I0311 18:55:06.914514 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.055951 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.059865 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.133857 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.149944 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.255436 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.261231 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.347727 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.462395 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.482683 4842 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.487564 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.487543385 podStartE2EDuration="40.487543385s" podCreationTimestamp="2026-03-11 18:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:54:48.971579479 +0000 UTC m=+334.619275759" watchObservedRunningTime="2026-03-11 18:55:07.487543385 +0000 UTC m=+353.135239675"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.487938 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-2flrr"]
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.487992 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7c4995446c-xhrg2","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 11 18:55:07 crc kubenswrapper[4842]: E0311 18:55:07.488177 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148bd39e-58ee-4a7f-aa9c-8435ab50d862" containerName="oauth-openshift"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488196 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="148bd39e-58ee-4a7f-aa9c-8435ab50d862" containerName="oauth-openshift"
Mar 11 18:55:07 crc kubenswrapper[4842]: E0311 18:55:07.488228 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" containerName="installer"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488237 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" containerName="installer"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488341 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488376 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="148bd39e-58ee-4a7f-aa9c-8435ab50d862" containerName="oauth-openshift"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488394 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2484672-71b2-46e3-9ede-780d3e1aaafc" containerName="installer"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488371 4842 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488556 4842 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="21686ceb-e0dd-49aa-9397-dea4bac2e26c"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.488817 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.494786 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.497563 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.497765 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.498513 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.498570 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.498606 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.498644 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.498678 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d17b8d08-f704-4b56-a4e4-2322e270afc5-audit-dir\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.498741 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-session\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.498950 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499029 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499076 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499166 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499201 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-audit-policies\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499245 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499320 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499356 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499363 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499391 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxd2b\" (UniqueName: \"kubernetes.io/projected/d17b8d08-f704-4b56-a4e4-2322e270afc5-kube-api-access-dxd2b\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499404 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499415 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499429 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499470 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499556 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.499834 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.500047 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.502672 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.502850 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.510514 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.519577 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.521977 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.522764 4842 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.527972 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.547629 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.547611912 podStartE2EDuration="18.547611912s" podCreationTimestamp="2026-03-11 18:54:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:55:07.544151933 +0000 UTC m=+353.191848213" watchObservedRunningTime="2026-03-11 18:55:07.547611912 +0000 UTC m=+353.195308222"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.600223 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.600854 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.600985 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.601116 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-audit-policies\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.601217 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.601339 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.601526 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.601642 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxd2b\" (UniqueName: \"kubernetes.io/projected/d17b8d08-f704-4b56-a4e4-2322e270afc5-kube-api-access-dxd2b\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.601799 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.602296 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.602005 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-audit-policies\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.602597 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.602776 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.602874 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.602972 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.603176 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d17b8d08-f704-4b56-a4e4-2322e270afc5-audit-dir\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.603343 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-session\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.603683 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d17b8d08-f704-4b56-a4e4-2322e270afc5-audit-dir\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.603699 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.615883 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-session\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.616476 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-login\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.616636 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.616767 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-error\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.617237 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.617657 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.618044 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.618545 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d17b8d08-f704-4b56-a4e4-2322e270afc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.628907 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxd2b\" (UniqueName: \"kubernetes.io/projected/d17b8d08-f704-4b56-a4e4-2322e270afc5-kube-api-access-dxd2b\") pod \"oauth-openshift-7c4995446c-xhrg2\" (UID: \"d17b8d08-f704-4b56-a4e4-2322e270afc5\") " pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.691958 4842 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.715808 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.815962 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.906672 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.934171 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 11 18:55:07 crc kubenswrapper[4842]: I0311 18:55:07.944531 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.056999 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.103298 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.186740 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.508296 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.537988 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.566722 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.628476 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.636173 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.640932 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.709719 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.724095 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.729089 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.763659 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.825199 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 11 18:55:08 crc kubenswrapper[4842]: I0311 18:55:08.973210 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148bd39e-58ee-4a7f-aa9c-8435ab50d862" path="/var/lib/kubelet/pods/148bd39e-58ee-4a7f-aa9c-8435ab50d862/volumes"
Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.017586 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.026487 4842 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-oauth-apiserver"/"audit-1" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.110777 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.168226 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.181902 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.206768 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.217688 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.372847 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.378422 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.384560 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.418096 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.443504 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.554317 4842 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.587875 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.592243 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.638402 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.719112 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.722166 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.788263 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.838383 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.863230 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.882217 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.901381 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 
18:55:09.961672 4842 scope.go:117] "RemoveContainer" containerID="c40f2428efe5fc97a23c24626bde2dbdb8426b5d6c960966c7c12c4a41195cbc" Mar 11 18:55:09 crc kubenswrapper[4842]: I0311 18:55:09.986940 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.013177 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.041658 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.108672 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.184451 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.227093 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.233627 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.234652 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.243172 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.338683 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.338739 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f0cc22d02dc8ef750412ff4213be355db9bea4564a8c7fd12cfb5dba37fc7a0e"} Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.352536 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.379156 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.440000 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.451085 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.524665 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.578472 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.611181 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.813057 4842 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.925173 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 11 18:55:10 crc kubenswrapper[4842]: I0311 18:55:10.947517 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.036778 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.094673 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.175146 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.262323 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.278774 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.292134 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.349117 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.349880 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.349948 4842 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="f0cc22d02dc8ef750412ff4213be355db9bea4564a8c7fd12cfb5dba37fc7a0e" exitCode=255 Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.349989 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"f0cc22d02dc8ef750412ff4213be355db9bea4564a8c7fd12cfb5dba37fc7a0e"} Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.350036 4842 scope.go:117] "RemoveContainer" containerID="c40f2428efe5fc97a23c24626bde2dbdb8426b5d6c960966c7c12c4a41195cbc" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.350714 4842 scope.go:117] "RemoveContainer" containerID="f0cc22d02dc8ef750412ff4213be355db9bea4564a8c7fd12cfb5dba37fc7a0e" Mar 11 18:55:11 crc kubenswrapper[4842]: E0311 18:55:11.351098 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.380456 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.404950 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 18:55:11 
crc kubenswrapper[4842]: I0311 18:55:11.649833 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.654043 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.684973 4842 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.685358 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1" gracePeriod=5 Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.775468 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.841469 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.921204 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.923092 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.928550 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 11 18:55:11 crc kubenswrapper[4842]: I0311 18:55:11.998358 4842 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.088224 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.090548 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.126391 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.247857 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.314005 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.356928 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.370705 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.401489 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c4995446c-xhrg2"] Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.490954 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.806095 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 
18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.808427 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.881934 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.908890 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7c4995446c-xhrg2"] Mar 11 18:55:12 crc kubenswrapper[4842]: W0311 18:55:12.921722 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd17b8d08_f704_4b56_a4e4_2322e270afc5.slice/crio-646a62f853aecbedd9477ef4362c83b4ac4e806866b08fa378d90d9f66c5b6c2 WatchSource:0}: Error finding container 646a62f853aecbedd9477ef4362c83b4ac4e806866b08fa378d90d9f66c5b6c2: Status 404 returned error can't find the container with id 646a62f853aecbedd9477ef4362c83b4ac4e806866b08fa378d90d9f66c5b6c2 Mar 11 18:55:12 crc kubenswrapper[4842]: I0311 18:55:12.953260 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.020479 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.035866 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.067148 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.096811 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 
11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.117700 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.164160 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.190576 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.336885 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.363613 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2" event={"ID":"d17b8d08-f704-4b56-a4e4-2322e270afc5","Type":"ContainerStarted","Data":"83a5b7024f912a9b43a5b830f07eaa13765b0eb75ab8332c77af450e0a46f434"} Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.363658 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2" event={"ID":"d17b8d08-f704-4b56-a4e4-2322e270afc5","Type":"ContainerStarted","Data":"646a62f853aecbedd9477ef4362c83b4ac4e806866b08fa378d90d9f66c5b6c2"} Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.364765 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.563589 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.563784 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 
18:55:13.608181 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.628227 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7c4995446c-xhrg2" podStartSLOduration=56.628210391 podStartE2EDuration="56.628210391s" podCreationTimestamp="2026-03-11 18:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:55:13.590435992 +0000 UTC m=+359.238132272" watchObservedRunningTime="2026-03-11 18:55:13.628210391 +0000 UTC m=+359.275906681" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.837805 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 18:55:13 crc kubenswrapper[4842]: I0311 18:55:13.928197 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.061821 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.342778 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.386436 4842 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.397363 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.441703 
4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.482525 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.703538 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.798554 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.833225 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.907995 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.932942 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 11 18:55:14 crc kubenswrapper[4842]: I0311 18:55:14.996656 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 18:55:15 crc kubenswrapper[4842]: I0311 18:55:15.124312 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 18:55:15 crc kubenswrapper[4842]: I0311 18:55:15.307347 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 18:55:15 crc kubenswrapper[4842]: I0311 18:55:15.524664 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 11 18:55:17 
crc kubenswrapper[4842]: I0311 18:55:17.289464 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.289559 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.388551 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.388636 4842 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1" exitCode=137 Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.388714 4842 scope.go:117] "RemoveContainer" containerID="4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1" Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.388747 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.410963 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411020 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411133 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411195 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411251 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411357 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411615 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411638 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.411688 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.412203 4842 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.412246 4842 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.412325 4842 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.412358 4842 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.420692 4842 scope.go:117] "RemoveContainer" containerID="4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1"
Mar 11 18:55:17 crc kubenswrapper[4842]: E0311 18:55:17.421575 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1\": container with ID starting with 4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1 not found: ID does not exist" containerID="4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1"
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.421642 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1"} err="failed to get container status \"4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1\": rpc error: code = NotFound desc = could not find container \"4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1\": container with ID starting with 4a6c74179dbac005960cb56fdad8009d73ebf2279105bc5ded87907a5eea05a1 not found: ID does not exist"
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.426914 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 18:55:17 crc kubenswrapper[4842]: I0311 18:55:17.513578 4842 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 11 18:55:17 crc kubenswrapper[4842]: E0311 18:55:17.759114 4842 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice\": RecentStats: unable to find data in memory cache]"
Mar 11 18:55:18 crc kubenswrapper[4842]: I0311 18:55:18.973035 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 11 18:55:18 crc kubenswrapper[4842]: I0311 18:55:18.973941 4842 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 11 18:55:18 crc kubenswrapper[4842]: I0311 18:55:18.988700 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 11 18:55:18 crc kubenswrapper[4842]: I0311 18:55:18.988752 4842 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="afea1329-fa6c-4ecf-8a37-e21f5f6d08c0"
Mar 11 18:55:18 crc kubenswrapper[4842]: I0311 18:55:18.995922 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 11 18:55:18 crc kubenswrapper[4842]: I0311 18:55:18.995985 4842 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="afea1329-fa6c-4ecf-8a37-e21f5f6d08c0"
Mar 11 18:55:21 crc kubenswrapper[4842]: I0311 18:55:21.962724 4842 scope.go:117] "RemoveContainer" containerID="f0cc22d02dc8ef750412ff4213be355db9bea4564a8c7fd12cfb5dba37fc7a0e"
Mar 11 18:55:21 crc kubenswrapper[4842]: E0311 18:55:21.963165 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 18:55:23 crc kubenswrapper[4842]: I0311 18:55:23.189942 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 11 18:55:33 crc kubenswrapper[4842]: I0311 18:55:33.498609 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 11 18:55:33 crc kubenswrapper[4842]: I0311 18:55:33.501481 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 11 18:55:33 crc kubenswrapper[4842]: I0311 18:55:33.502701 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 11 18:55:33 crc kubenswrapper[4842]: I0311 18:55:33.502745 4842 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="22c1f797663260326b0026e9db3131e95c7e2adfca3641e41ac9ec05f666b8a7" exitCode=137
Mar 11 18:55:33 crc kubenswrapper[4842]: I0311 18:55:33.502777 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"22c1f797663260326b0026e9db3131e95c7e2adfca3641e41ac9ec05f666b8a7"}
Mar 11 18:55:33 crc kubenswrapper[4842]: I0311 18:55:33.502814 4842 scope.go:117] "RemoveContainer" containerID="f00259a02b49db7f35c2ce386d96e40dcfdbc2a17c6637ca6e73b27b06143cbd"
Mar 11 18:55:34 crc kubenswrapper[4842]: I0311 18:55:34.514373 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 11 18:55:34 crc kubenswrapper[4842]: I0311 18:55:34.516912 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 11 18:55:34 crc kubenswrapper[4842]: I0311 18:55:34.518050 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7f4e4239fa30327982fc733c62a0bc69da74d5dcdabcaeb1dd1905c6b209626b"}
Mar 11 18:55:36 crc kubenswrapper[4842]: I0311 18:55:36.531578 4842 generic.go:334] "Generic (PLEG): container finished" podID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerID="120d6c84408fa4e203d1a983cc8e45f9fcbaadc5eba377764a5c0715f2e1865c" exitCode=0
Mar 11 18:55:36 crc kubenswrapper[4842]: I0311 18:55:36.531698 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" event={"ID":"6e8e2825-2a37-4731-bc73-4e469bc34334","Type":"ContainerDied","Data":"120d6c84408fa4e203d1a983cc8e45f9fcbaadc5eba377764a5c0715f2e1865c"}
Mar 11 18:55:36 crc kubenswrapper[4842]: I0311 18:55:36.532189 4842 scope.go:117] "RemoveContainer" containerID="120d6c84408fa4e203d1a983cc8e45f9fcbaadc5eba377764a5c0715f2e1865c"
Mar 11 18:55:36 crc kubenswrapper[4842]: I0311 18:55:36.732661 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:55:36 crc kubenswrapper[4842]: I0311 18:55:36.961794 4842 scope.go:117] "RemoveContainer" containerID="f0cc22d02dc8ef750412ff4213be355db9bea4564a8c7fd12cfb5dba37fc7a0e"
Mar 11 18:55:37 crc kubenswrapper[4842]: I0311 18:55:37.539146 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" event={"ID":"6e8e2825-2a37-4731-bc73-4e469bc34334","Type":"ContainerStarted","Data":"0248cba12a4f8cac02bad816568b52d9150bf94c7a949fc3eaa3c1d571f46065"}
Mar 11 18:55:37 crc kubenswrapper[4842]: I0311 18:55:37.539950 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq"
Mar 11 18:55:37 crc kubenswrapper[4842]: I0311 18:55:37.541685 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 11 18:55:37 crc kubenswrapper[4842]: I0311 18:55:37.541746 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"17e18e6c9c64db61b6f14d593d81b55f25af63c3a5b773059df05fbb00cb16f8"}
Mar 11 18:55:37 crc kubenswrapper[4842]: I0311 18:55:37.546252 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq"
Mar 11 18:55:42 crc kubenswrapper[4842]: I0311 18:55:42.786126 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:55:42 crc kubenswrapper[4842]: I0311 18:55:42.791800 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:55:43 crc kubenswrapper[4842]: I0311 18:55:43.584870 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 11 18:55:46 crc kubenswrapper[4842]: I0311 18:55:46.436514 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 11 18:55:52 crc kubenswrapper[4842]: I0311 18:55:52.838701 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.160367 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554256-mx227"]
Mar 11 18:56:00 crc kubenswrapper[4842]: E0311 18:56:00.161132 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.161148 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.161343 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.162036 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554256-mx227"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.164547 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.165424 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.165733 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.171109 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554256-mx227"]
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.319626 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtksr\" (UniqueName: \"kubernetes.io/projected/ad25a877-2509-4417-8ece-9413e28450a3-kube-api-access-gtksr\") pod \"auto-csr-approver-29554256-mx227\" (UID: \"ad25a877-2509-4417-8ece-9413e28450a3\") " pod="openshift-infra/auto-csr-approver-29554256-mx227"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.421255 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtksr\" (UniqueName: \"kubernetes.io/projected/ad25a877-2509-4417-8ece-9413e28450a3-kube-api-access-gtksr\") pod \"auto-csr-approver-29554256-mx227\" (UID: \"ad25a877-2509-4417-8ece-9413e28450a3\") " pod="openshift-infra/auto-csr-approver-29554256-mx227"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.441972 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtksr\" (UniqueName: \"kubernetes.io/projected/ad25a877-2509-4417-8ece-9413e28450a3-kube-api-access-gtksr\") pod \"auto-csr-approver-29554256-mx227\" (UID: \"ad25a877-2509-4417-8ece-9413e28450a3\") " pod="openshift-infra/auto-csr-approver-29554256-mx227"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.491061 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554256-mx227"
Mar 11 18:56:00 crc kubenswrapper[4842]: I0311 18:56:00.861318 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554256-mx227"]
Mar 11 18:56:01 crc kubenswrapper[4842]: I0311 18:56:01.471705 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 18:56:01 crc kubenswrapper[4842]: I0311 18:56:01.472070 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 18:56:01 crc kubenswrapper[4842]: I0311 18:56:01.683001 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554256-mx227" event={"ID":"ad25a877-2509-4417-8ece-9413e28450a3","Type":"ContainerStarted","Data":"6753ce0e453658e13c44b63bba76c3ddbfac77b04658784fd88ff7a62a4278d9"}
Mar 11 18:56:02 crc kubenswrapper[4842]: I0311 18:56:02.690715 4842 generic.go:334] "Generic (PLEG): container finished" podID="ad25a877-2509-4417-8ece-9413e28450a3" containerID="f98f6cbf392e6bf1410f92338635f01783785262d95b32645664783ad01b3511" exitCode=0
Mar 11 18:56:02 crc kubenswrapper[4842]: I0311 18:56:02.690782 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554256-mx227" event={"ID":"ad25a877-2509-4417-8ece-9413e28450a3","Type":"ContainerDied","Data":"f98f6cbf392e6bf1410f92338635f01783785262d95b32645664783ad01b3511"}
Mar 11 18:56:03 crc kubenswrapper[4842]: I0311 18:56:03.954694 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554256-mx227"
Mar 11 18:56:04 crc kubenswrapper[4842]: I0311 18:56:04.059646 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtksr\" (UniqueName: \"kubernetes.io/projected/ad25a877-2509-4417-8ece-9413e28450a3-kube-api-access-gtksr\") pod \"ad25a877-2509-4417-8ece-9413e28450a3\" (UID: \"ad25a877-2509-4417-8ece-9413e28450a3\") "
Mar 11 18:56:04 crc kubenswrapper[4842]: I0311 18:56:04.065936 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad25a877-2509-4417-8ece-9413e28450a3-kube-api-access-gtksr" (OuterVolumeSpecName: "kube-api-access-gtksr") pod "ad25a877-2509-4417-8ece-9413e28450a3" (UID: "ad25a877-2509-4417-8ece-9413e28450a3"). InnerVolumeSpecName "kube-api-access-gtksr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:56:04 crc kubenswrapper[4842]: I0311 18:56:04.161876 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtksr\" (UniqueName: \"kubernetes.io/projected/ad25a877-2509-4417-8ece-9413e28450a3-kube-api-access-gtksr\") on node \"crc\" DevicePath \"\""
Mar 11 18:56:04 crc kubenswrapper[4842]: I0311 18:56:04.701472 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554256-mx227" event={"ID":"ad25a877-2509-4417-8ece-9413e28450a3","Type":"ContainerDied","Data":"6753ce0e453658e13c44b63bba76c3ddbfac77b04658784fd88ff7a62a4278d9"}
Mar 11 18:56:04 crc kubenswrapper[4842]: I0311 18:56:04.701758 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6753ce0e453658e13c44b63bba76c3ddbfac77b04658784fd88ff7a62a4278d9"
Mar 11 18:56:04 crc kubenswrapper[4842]: I0311 18:56:04.701585 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554256-mx227"
Mar 11 18:56:31 crc kubenswrapper[4842]: I0311 18:56:31.471580 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 18:56:31 crc kubenswrapper[4842]: I0311 18:56:31.472182 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.313908 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hcnw9"]
Mar 11 18:56:34 crc kubenswrapper[4842]: E0311 18:56:34.314543 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad25a877-2509-4417-8ece-9413e28450a3" containerName="oc"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.314566 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad25a877-2509-4417-8ece-9413e28450a3" containerName="oc"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.314729 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad25a877-2509-4417-8ece-9413e28450a3" containerName="oc"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.315366 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.329535 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hcnw9"]
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451376 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9518bab3-9c3f-45c5-b1f5-176425ea7e53-registry-certificates\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451453 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-registry-tls\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451591 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfnd9\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-kube-api-access-pfnd9\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451656 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9518bab3-9c3f-45c5-b1f5-176425ea7e53-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451696 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9518bab3-9c3f-45c5-b1f5-176425ea7e53-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451723 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9518bab3-9c3f-45c5-b1f5-176425ea7e53-trusted-ca\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451774 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.451795 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-bound-sa-token\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.486708 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.553575 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-registry-tls\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.553644 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfnd9\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-kube-api-access-pfnd9\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.553715 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9518bab3-9c3f-45c5-b1f5-176425ea7e53-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.554858 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9518bab3-9c3f-45c5-b1f5-176425ea7e53-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.554908 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9518bab3-9c3f-45c5-b1f5-176425ea7e53-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.554956 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9518bab3-9c3f-45c5-b1f5-176425ea7e53-trusted-ca\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.556093 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9518bab3-9c3f-45c5-b1f5-176425ea7e53-trusted-ca\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.554989 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-bound-sa-token\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.556205 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9518bab3-9c3f-45c5-b1f5-176425ea7e53-registry-certificates\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.557406 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9518bab3-9c3f-45c5-b1f5-176425ea7e53-registry-certificates\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.559849 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-registry-tls\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.561922 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9518bab3-9c3f-45c5-b1f5-176425ea7e53-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.568433 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-bound-sa-token\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.572375 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfnd9\" (UniqueName: \"kubernetes.io/projected/9518bab3-9c3f-45c5-b1f5-176425ea7e53-kube-api-access-pfnd9\") pod \"image-registry-66df7c8f76-hcnw9\" (UID: \"9518bab3-9c3f-45c5-b1f5-176425ea7e53\") " pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.636792 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:34 crc kubenswrapper[4842]: I0311 18:56:34.906831 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hcnw9"]
Mar 11 18:56:35 crc kubenswrapper[4842]: I0311 18:56:35.882740 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9" event={"ID":"9518bab3-9c3f-45c5-b1f5-176425ea7e53","Type":"ContainerStarted","Data":"b421b9e7d6bfa54c2c464d9aec8dcb7684b611b41dd60cc042f66535c757abc5"}
Mar 11 18:56:35 crc kubenswrapper[4842]: I0311 18:56:35.882813 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9" event={"ID":"9518bab3-9c3f-45c5-b1f5-176425ea7e53","Type":"ContainerStarted","Data":"d7dac6d962e31a4c7fec44fb16a9bbf28c96fc8aecedb25f2a47743450d61c02"}
Mar 11 18:56:35 crc kubenswrapper[4842]: I0311 18:56:35.882922 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:35 crc kubenswrapper[4842]: I0311 18:56:35.900415 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9" podStartSLOduration=1.900397097 podStartE2EDuration="1.900397097s" podCreationTimestamp="2026-03-11 18:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:56:35.899562193 +0000 UTC m=+441.547258463" watchObservedRunningTime="2026-03-11 18:56:35.900397097 +0000 UTC m=+441.548093377"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.532171 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bg2xr"]
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.533211 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bg2xr" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerName="registry-server" containerID="cri-o://962be4f0ab85aa58ee0681d0cf5db1d1f3f7d476fd6e64c4fbd5be31d60cf079" gracePeriod=30
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.542242 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vb82w"]
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.542586 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vb82w" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerName="registry-server" containerID="cri-o://371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0" gracePeriod=30
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.558839 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-czxnq"]
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.559098 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" containerID="cri-o://0248cba12a4f8cac02bad816568b52d9150bf94c7a949fc3eaa3c1d571f46065" gracePeriod=30
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.575239 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rrvh"]
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.575541 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6rrvh" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="registry-server" containerID="cri-o://e64d291da777c56c62b53f1ae235ae903161067e39a1db556eaca3a58b30eb82" gracePeriod=30
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.586258 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nqsd8"]
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.587314 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.593129 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvszz"]
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.593343 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jvszz" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="registry-server" containerID="cri-o://a50cd1d4365c3d5e1b721201d2640c6d833ccf538c623ca1d7ca0f336fa6be54" gracePeriod=30
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.599894 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nqsd8"]
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.737523 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt65s\" (UniqueName: \"kubernetes.io/projected/e45c9752-2328-495a-88dd-a6c769b7f012-kube-api-access-mt65s\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.737631 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45c9752-2328-495a-88dd-a6c769b7f012-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.737689 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e45c9752-2328-495a-88dd-a6c769b7f012-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.838910 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45c9752-2328-495a-88dd-a6c769b7f012-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.838984 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e45c9752-2328-495a-88dd-a6c769b7f012-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.839017 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt65s\" (UniqueName: \"kubernetes.io/projected/e45c9752-2328-495a-88dd-a6c769b7f012-kube-api-access-mt65s\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.840644 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45c9752-2328-495a-88dd-a6c769b7f012-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8"
Mar 11 18:56:47
crc kubenswrapper[4842]: I0311 18:56:47.850039 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e45c9752-2328-495a-88dd-a6c769b7f012-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.857760 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt65s\" (UniqueName: \"kubernetes.io/projected/e45c9752-2328-495a-88dd-a6c769b7f012-kube-api-access-mt65s\") pod \"marketplace-operator-79b997595-nqsd8\" (UID: \"e45c9752-2328-495a-88dd-a6c769b7f012\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" Mar 11 18:56:47 crc kubenswrapper[4842]: I0311 18:56:47.908515 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.008595 4842 generic.go:334] "Generic (PLEG): container finished" podID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerID="962be4f0ab85aa58ee0681d0cf5db1d1f3f7d476fd6e64c4fbd5be31d60cf079" exitCode=0 Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.008660 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg2xr" event={"ID":"7771ebaa-648a-46c4-986c-2cea25b5b7df","Type":"ContainerDied","Data":"962be4f0ab85aa58ee0681d0cf5db1d1f3f7d476fd6e64c4fbd5be31d60cf079"} Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.008704 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg2xr" event={"ID":"7771ebaa-648a-46c4-986c-2cea25b5b7df","Type":"ContainerDied","Data":"bb665212a33380ea52581d46f3c02581cf6a50fa3e286ff73dcd41cebc8d7806"} Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 
18:56:48.008717 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb665212a33380ea52581d46f3c02581cf6a50fa3e286ff73dcd41cebc8d7806" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.009159 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.011704 4842 generic.go:334] "Generic (PLEG): container finished" podID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerID="e64d291da777c56c62b53f1ae235ae903161067e39a1db556eaca3a58b30eb82" exitCode=0 Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.011759 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rrvh" event={"ID":"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7","Type":"ContainerDied","Data":"e64d291da777c56c62b53f1ae235ae903161067e39a1db556eaca3a58b30eb82"} Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.014391 4842 generic.go:334] "Generic (PLEG): container finished" podID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerID="0248cba12a4f8cac02bad816568b52d9150bf94c7a949fc3eaa3c1d571f46065" exitCode=0 Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.014464 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" event={"ID":"6e8e2825-2a37-4731-bc73-4e469bc34334","Type":"ContainerDied","Data":"0248cba12a4f8cac02bad816568b52d9150bf94c7a949fc3eaa3c1d571f46065"} Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.014512 4842 scope.go:117] "RemoveContainer" containerID="120d6c84408fa4e203d1a983cc8e45f9fcbaadc5eba377764a5c0715f2e1865c" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.014465 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.018490 4842 generic.go:334] "Generic (PLEG): container finished" podID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerID="371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0" exitCode=0 Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.018548 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb82w" event={"ID":"ff824009-ab02-4a23-9c8a-76bc3d6a5f04","Type":"ContainerDied","Data":"371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0"} Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.018571 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vb82w" event={"ID":"ff824009-ab02-4a23-9c8a-76bc3d6a5f04","Type":"ContainerDied","Data":"9e4ff073384f0d36d2fd8f4207a4564fc517f7d52ab24a02318d2152d777079d"} Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.018660 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vb82w" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.021262 4842 generic.go:334] "Generic (PLEG): container finished" podID="7857f1af-d426-446f-a295-05423f407554" containerID="a50cd1d4365c3d5e1b721201d2640c6d833ccf538c623ca1d7ca0f336fa6be54" exitCode=0 Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.021332 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvszz" event={"ID":"7857f1af-d426-446f-a295-05423f407554","Type":"ContainerDied","Data":"a50cd1d4365c3d5e1b721201d2640c6d833ccf538c623ca1d7ca0f336fa6be54"} Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.048877 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.087657 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.092278 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.122147 4842 scope.go:117] "RemoveContainer" containerID="371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146159 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnsd\" (UniqueName: \"kubernetes.io/projected/7771ebaa-648a-46c4-986c-2cea25b5b7df-kube-api-access-xmnsd\") pod \"7771ebaa-648a-46c4-986c-2cea25b5b7df\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146216 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-utilities\") pod \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146619 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-utilities\") pod \"7857f1af-d426-446f-a295-05423f407554\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146648 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzkht\" (UniqueName: \"kubernetes.io/projected/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-kube-api-access-hzkht\") pod 
\"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146678 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-operator-metrics\") pod \"6e8e2825-2a37-4731-bc73-4e469bc34334\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146700 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jr8f\" (UniqueName: \"kubernetes.io/projected/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-kube-api-access-9jr8f\") pod \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146772 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcthv\" (UniqueName: \"kubernetes.io/projected/7857f1af-d426-446f-a295-05423f407554-kube-api-access-dcthv\") pod \"7857f1af-d426-446f-a295-05423f407554\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146840 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-catalog-content\") pod \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146894 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-catalog-content\") pod \"7771ebaa-648a-46c4-986c-2cea25b5b7df\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146927 4842 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-trusted-ca\") pod \"6e8e2825-2a37-4731-bc73-4e469bc34334\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.146984 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-catalog-content\") pod \"7857f1af-d426-446f-a295-05423f407554\" (UID: \"7857f1af-d426-446f-a295-05423f407554\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.147013 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgp6x\" (UniqueName: \"kubernetes.io/projected/6e8e2825-2a37-4731-bc73-4e469bc34334-kube-api-access-pgp6x\") pod \"6e8e2825-2a37-4731-bc73-4e469bc34334\" (UID: \"6e8e2825-2a37-4731-bc73-4e469bc34334\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.147063 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-utilities\") pod \"7771ebaa-648a-46c4-986c-2cea25b5b7df\" (UID: \"7771ebaa-648a-46c4-986c-2cea25b5b7df\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.147092 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-utilities\") pod \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\" (UID: \"ff824009-ab02-4a23-9c8a-76bc3d6a5f04\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.147168 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-catalog-content\") pod 
\"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\" (UID: \"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7\") " Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.147441 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-utilities" (OuterVolumeSpecName: "utilities") pod "13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" (UID: "13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.148240 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-utilities" (OuterVolumeSpecName: "utilities") pod "7857f1af-d426-446f-a295-05423f407554" (UID: "7857f1af-d426-446f-a295-05423f407554"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.149211 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-utilities" (OuterVolumeSpecName: "utilities") pod "7771ebaa-648a-46c4-986c-2cea25b5b7df" (UID: "7771ebaa-648a-46c4-986c-2cea25b5b7df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.149939 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6e8e2825-2a37-4731-bc73-4e469bc34334" (UID: "6e8e2825-2a37-4731-bc73-4e469bc34334"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.153437 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-utilities" (OuterVolumeSpecName: "utilities") pod "ff824009-ab02-4a23-9c8a-76bc3d6a5f04" (UID: "ff824009-ab02-4a23-9c8a-76bc3d6a5f04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.154792 4842 scope.go:117] "RemoveContainer" containerID="a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.152994 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6e8e2825-2a37-4731-bc73-4e469bc34334" (UID: "6e8e2825-2a37-4731-bc73-4e469bc34334"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.156512 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7771ebaa-648a-46c4-986c-2cea25b5b7df-kube-api-access-xmnsd" (OuterVolumeSpecName: "kube-api-access-xmnsd") pod "7771ebaa-648a-46c4-986c-2cea25b5b7df" (UID: "7771ebaa-648a-46c4-986c-2cea25b5b7df"). InnerVolumeSpecName "kube-api-access-xmnsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.156617 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e8e2825-2a37-4731-bc73-4e469bc34334-kube-api-access-pgp6x" (OuterVolumeSpecName: "kube-api-access-pgp6x") pod "6e8e2825-2a37-4731-bc73-4e469bc34334" (UID: "6e8e2825-2a37-4731-bc73-4e469bc34334"). 
InnerVolumeSpecName "kube-api-access-pgp6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.159873 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-kube-api-access-9jr8f" (OuterVolumeSpecName: "kube-api-access-9jr8f") pod "13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" (UID: "13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7"). InnerVolumeSpecName "kube-api-access-9jr8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.164260 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7857f1af-d426-446f-a295-05423f407554-kube-api-access-dcthv" (OuterVolumeSpecName: "kube-api-access-dcthv") pod "7857f1af-d426-446f-a295-05423f407554" (UID: "7857f1af-d426-446f-a295-05423f407554"). InnerVolumeSpecName "kube-api-access-dcthv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.169600 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-kube-api-access-hzkht" (OuterVolumeSpecName: "kube-api-access-hzkht") pod "ff824009-ab02-4a23-9c8a-76bc3d6a5f04" (UID: "ff824009-ab02-4a23-9c8a-76bc3d6a5f04"). InnerVolumeSpecName "kube-api-access-hzkht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.189494 4842 scope.go:117] "RemoveContainer" containerID="1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.189681 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" (UID: "13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.206007 4842 scope.go:117] "RemoveContainer" containerID="371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0" Mar 11 18:56:48 crc kubenswrapper[4842]: E0311 18:56:48.206849 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0\": container with ID starting with 371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0 not found: ID does not exist" containerID="371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.206903 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0"} err="failed to get container status \"371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0\": rpc error: code = NotFound desc = could not find container \"371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0\": container with ID starting with 371e0d7a28d579444c5ec5c547b2e002f3da7000c0649586305e33e0886964a0 not found: ID does not exist" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.206937 4842 scope.go:117] 
"RemoveContainer" containerID="a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6" Mar 11 18:56:48 crc kubenswrapper[4842]: E0311 18:56:48.207570 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6\": container with ID starting with a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6 not found: ID does not exist" containerID="a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.207608 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6"} err="failed to get container status \"a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6\": rpc error: code = NotFound desc = could not find container \"a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6\": container with ID starting with a4f637cd4fff292862e0072f9a7ff2b7e16a27a3dc0a244a81dfeba0b45e12e6 not found: ID does not exist" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.207639 4842 scope.go:117] "RemoveContainer" containerID="1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4" Mar 11 18:56:48 crc kubenswrapper[4842]: E0311 18:56:48.208106 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4\": container with ID starting with 1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4 not found: ID does not exist" containerID="1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.208191 4842 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4"} err="failed to get container status \"1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4\": rpc error: code = NotFound desc = could not find container \"1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4\": container with ID starting with 1ac27bb79f20a60908d13ee29c887bfe11b8ce7d3f5b38062bdc416e76d7e7a4 not found: ID does not exist" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.230154 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff824009-ab02-4a23-9c8a-76bc3d6a5f04" (UID: "ff824009-ab02-4a23-9c8a-76bc3d6a5f04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.233470 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7771ebaa-648a-46c4-986c-2cea25b5b7df" (UID: "7771ebaa-648a-46c4-986c-2cea25b5b7df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248701 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcthv\" (UniqueName: \"kubernetes.io/projected/7857f1af-d426-446f-a295-05423f407554-kube-api-access-dcthv\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248747 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248759 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248768 4842 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248777 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgp6x\" (UniqueName: \"kubernetes.io/projected/6e8e2825-2a37-4731-bc73-4e469bc34334-kube-api-access-pgp6x\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248787 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7771ebaa-648a-46c4-986c-2cea25b5b7df-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248795 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc 
kubenswrapper[4842]: I0311 18:56:48.248804 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248836 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnsd\" (UniqueName: \"kubernetes.io/projected/7771ebaa-648a-46c4-986c-2cea25b5b7df-kube-api-access-xmnsd\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248848 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.248859 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.249520 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzkht\" (UniqueName: \"kubernetes.io/projected/ff824009-ab02-4a23-9c8a-76bc3d6a5f04-kube-api-access-hzkht\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.249531 4842 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6e8e2825-2a37-4731-bc73-4e469bc34334-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.249541 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jr8f\" (UniqueName: \"kubernetes.io/projected/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7-kube-api-access-9jr8f\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.296610 4842 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7857f1af-d426-446f-a295-05423f407554" (UID: "7857f1af-d426-446f-a295-05423f407554"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.351252 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7857f1af-d426-446f-a295-05423f407554-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.355733 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vb82w"] Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.364589 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vb82w"] Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.387008 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nqsd8"] Mar 11 18:56:48 crc kubenswrapper[4842]: W0311 18:56:48.389916 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45c9752_2328_495a_88dd_a6c769b7f012.slice/crio-cc2ef618a3836584e07aa8a496a603b18985f398c98114b317dfc6dc862be673 WatchSource:0}: Error finding container cc2ef618a3836584e07aa8a496a603b18985f398c98114b317dfc6dc862be673: Status 404 returned error can't find the container with id cc2ef618a3836584e07aa8a496a603b18985f398c98114b317dfc6dc862be673 Mar 11 18:56:48 crc kubenswrapper[4842]: I0311 18:56:48.970369 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" path="/var/lib/kubelet/pods/ff824009-ab02-4a23-9c8a-76bc3d6a5f04/volumes" Mar 11 18:56:49 crc 
kubenswrapper[4842]: I0311 18:56:49.029417 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rrvh" event={"ID":"13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7","Type":"ContainerDied","Data":"1d092e09cc5d4c02761f34cf1c4bfba02889d433eb8216b34d4d70671b527366"} Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.029491 4842 scope.go:117] "RemoveContainer" containerID="e64d291da777c56c62b53f1ae235ae903161067e39a1db556eaca3a58b30eb82" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.029746 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rrvh" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.030982 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" event={"ID":"6e8e2825-2a37-4731-bc73-4e469bc34334","Type":"ContainerDied","Data":"bbe847fa4ecb2b077b95ca3e23985e1f04995428ee05e1648d52f5e3aad1f6be"} Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.031430 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-czxnq" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.038010 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvszz" event={"ID":"7857f1af-d426-446f-a295-05423f407554","Type":"ContainerDied","Data":"2212ab128c3a804ebbf1814b2946bb9b4dc1dfe26608180e50d7ca8a8cff2a06"} Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.038136 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvszz" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.042805 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg2xr" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.042800 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" event={"ID":"e45c9752-2328-495a-88dd-a6c769b7f012","Type":"ContainerStarted","Data":"6af09562f2d184ce91327d598b826f0b42fb4178ce688b48dc097fb73bd5757e"} Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.043266 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" event={"ID":"e45c9752-2328-495a-88dd-a6c769b7f012","Type":"ContainerStarted","Data":"cc2ef618a3836584e07aa8a496a603b18985f398c98114b317dfc6dc862be673"} Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.043339 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.049196 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.051731 4842 scope.go:117] "RemoveContainer" containerID="3a0a4879a9940bb81bb8bdae6954db57043686dc5a432878a73c8edc4dc7b818" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.076019 4842 scope.go:117] "RemoveContainer" containerID="2b6c45befea2c91145e4d4a8b4637ccc830b1c282938fdcec488e831b3e07242" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.080349 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-czxnq"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.089869 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-czxnq"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.095265 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6rrvh"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.103646 4842 scope.go:117] "RemoveContainer" containerID="0248cba12a4f8cac02bad816568b52d9150bf94c7a949fc3eaa3c1d571f46065" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.112242 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rrvh"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.119221 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvszz"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.126299 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvszz"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.135576 4842 scope.go:117] "RemoveContainer" containerID="a50cd1d4365c3d5e1b721201d2640c6d833ccf538c623ca1d7ca0f336fa6be54" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.137326 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bg2xr"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.141014 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nqsd8" podStartSLOduration=2.14099593 podStartE2EDuration="2.14099593s" podCreationTimestamp="2026-03-11 18:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 18:56:49.133721759 +0000 UTC m=+454.781418029" watchObservedRunningTime="2026-03-11 18:56:49.14099593 +0000 UTC m=+454.788692210" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.143805 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bg2xr"] Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.156481 4842 scope.go:117] "RemoveContainer" 
containerID="af59d540f2aea45554f84a18d24024d2d5956499039d0121360a1d8ce98544e5" Mar 11 18:56:49 crc kubenswrapper[4842]: I0311 18:56:49.206021 4842 scope.go:117] "RemoveContainer" containerID="05505b8293d6a7b97c802c2d9e9f9d99e8ef9dd8dbc2cd67efe418a6680f8dc4" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.544822 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kzpq5"] Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545257 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545319 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545350 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545369 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545400 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545418 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545442 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545458 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" 
containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545513 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545530 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545547 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545559 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545580 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545592 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545611 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545624 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerName="extract-content" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545642 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545654 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" 
containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545669 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545682 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545701 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545714 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545731 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545743 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="extract-utilities" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545755 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545768 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" Mar 11 18:56:50 crc kubenswrapper[4842]: E0311 18:56:50.545791 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545803 4842 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.545993 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7857f1af-d426-446f-a295-05423f407554" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.546012 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.546032 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.546055 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff824009-ab02-4a23-9c8a-76bc3d6a5f04" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.546068 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" containerName="marketplace-operator" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.546088 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" containerName="registry-server" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.561631 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kzpq5"] Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.562101 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.570211 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.692566 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55039ae-0af8-42cc-a100-7b8893fb9400-utilities\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.692601 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thw8n\" (UniqueName: \"kubernetes.io/projected/b55039ae-0af8-42cc-a100-7b8893fb9400-kube-api-access-thw8n\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.692641 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55039ae-0af8-42cc-a100-7b8893fb9400-catalog-content\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.794205 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55039ae-0af8-42cc-a100-7b8893fb9400-utilities\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.794255 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-thw8n\" (UniqueName: \"kubernetes.io/projected/b55039ae-0af8-42cc-a100-7b8893fb9400-kube-api-access-thw8n\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.794337 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55039ae-0af8-42cc-a100-7b8893fb9400-catalog-content\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.794791 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55039ae-0af8-42cc-a100-7b8893fb9400-utilities\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.794857 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55039ae-0af8-42cc-a100-7b8893fb9400-catalog-content\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.811927 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thw8n\" (UniqueName: \"kubernetes.io/projected/b55039ae-0af8-42cc-a100-7b8893fb9400-kube-api-access-thw8n\") pod \"redhat-marketplace-kzpq5\" (UID: \"b55039ae-0af8-42cc-a100-7b8893fb9400\") " pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:50 crc kubenswrapper[4842]: I0311 18:56:50.892079 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kzpq5" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:50.989328 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7" path="/var/lib/kubelet/pods/13a8bdba-b309-4c2d-b3f7-a4f1aa95d0a7/volumes" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:50.994548 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e8e2825-2a37-4731-bc73-4e469bc34334" path="/var/lib/kubelet/pods/6e8e2825-2a37-4731-bc73-4e469bc34334/volumes" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:50.995367 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7771ebaa-648a-46c4-986c-2cea25b5b7df" path="/var/lib/kubelet/pods/7771ebaa-648a-46c4-986c-2cea25b5b7df/volumes" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:50.996297 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7857f1af-d426-446f-a295-05423f407554" path="/var/lib/kubelet/pods/7857f1af-d426-446f-a295-05423f407554/volumes" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.142454 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jzgww"] Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.143607 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.147126 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.156052 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jzgww"] Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.203230 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f914e9d1-11a6-46bd-af88-7a238ade220f-catalog-content\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.203363 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f914e9d1-11a6-46bd-af88-7a238ade220f-utilities\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.203468 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmq5p\" (UniqueName: \"kubernetes.io/projected/f914e9d1-11a6-46bd-af88-7a238ade220f-kube-api-access-kmq5p\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.305143 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f914e9d1-11a6-46bd-af88-7a238ade220f-utilities\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " 
pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.305252 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmq5p\" (UniqueName: \"kubernetes.io/projected/f914e9d1-11a6-46bd-af88-7a238ade220f-kube-api-access-kmq5p\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.305338 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f914e9d1-11a6-46bd-af88-7a238ade220f-catalog-content\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.306031 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f914e9d1-11a6-46bd-af88-7a238ade220f-catalog-content\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.306027 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f914e9d1-11a6-46bd-af88-7a238ade220f-utilities\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.372029 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmq5p\" (UniqueName: \"kubernetes.io/projected/f914e9d1-11a6-46bd-af88-7a238ade220f-kube-api-access-kmq5p\") pod \"redhat-operators-jzgww\" (UID: \"f914e9d1-11a6-46bd-af88-7a238ade220f\") " pod="openshift-marketplace/redhat-operators-jzgww" Mar 
11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.389657 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kzpq5"] Mar 11 18:56:51 crc kubenswrapper[4842]: W0311 18:56:51.390351 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb55039ae_0af8_42cc_a100_7b8893fb9400.slice/crio-6c832074ff7f7fa39604ff42e4a6cb652aabc4c602be137a8e4f2240fd3abbe6 WatchSource:0}: Error finding container 6c832074ff7f7fa39604ff42e4a6cb652aabc4c602be137a8e4f2240fd3abbe6: Status 404 returned error can't find the container with id 6c832074ff7f7fa39604ff42e4a6cb652aabc4c602be137a8e4f2240fd3abbe6 Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.462365 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jzgww" Mar 11 18:56:51 crc kubenswrapper[4842]: I0311 18:56:51.638733 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jzgww"] Mar 11 18:56:51 crc kubenswrapper[4842]: W0311 18:56:51.660405 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf914e9d1_11a6_46bd_af88_7a238ade220f.slice/crio-2f24f3270d6f9a61da0634916200cf1864be2770b9a4314b1b8f9e68add10b90 WatchSource:0}: Error finding container 2f24f3270d6f9a61da0634916200cf1864be2770b9a4314b1b8f9e68add10b90: Status 404 returned error can't find the container with id 2f24f3270d6f9a61da0634916200cf1864be2770b9a4314b1b8f9e68add10b90 Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.063790 4842 generic.go:334] "Generic (PLEG): container finished" podID="f914e9d1-11a6-46bd-af88-7a238ade220f" containerID="28d501192ee57f8001f215d23d7bbb903b3ea41a804ebc216d8f7453c890f6e6" exitCode=0 Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.063858 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jzgww" event={"ID":"f914e9d1-11a6-46bd-af88-7a238ade220f","Type":"ContainerDied","Data":"28d501192ee57f8001f215d23d7bbb903b3ea41a804ebc216d8f7453c890f6e6"} Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.063884 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzgww" event={"ID":"f914e9d1-11a6-46bd-af88-7a238ade220f","Type":"ContainerStarted","Data":"2f24f3270d6f9a61da0634916200cf1864be2770b9a4314b1b8f9e68add10b90"} Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.065862 4842 generic.go:334] "Generic (PLEG): container finished" podID="b55039ae-0af8-42cc-a100-7b8893fb9400" containerID="cac3b4bc14f203a918055f92987713d9d93b06dbb9f8e89f767e21c6eefa7cde" exitCode=0 Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.065914 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kzpq5" event={"ID":"b55039ae-0af8-42cc-a100-7b8893fb9400","Type":"ContainerDied","Data":"cac3b4bc14f203a918055f92987713d9d93b06dbb9f8e89f767e21c6eefa7cde"} Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.065944 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kzpq5" event={"ID":"b55039ae-0af8-42cc-a100-7b8893fb9400","Type":"ContainerStarted","Data":"6c832074ff7f7fa39604ff42e4a6cb652aabc4c602be137a8e4f2240fd3abbe6"} Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.937382 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jtlqq"] Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.938587 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.940332 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 18:56:52 crc kubenswrapper[4842]: I0311 18:56:52.958546 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtlqq"] Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.032173 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f56kx\" (UniqueName: \"kubernetes.io/projected/16d57199-f634-477a-a8d1-5c1f6c97f24b-kube-api-access-f56kx\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.032296 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d57199-f634-477a-a8d1-5c1f6c97f24b-catalog-content\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.032337 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d57199-f634-477a-a8d1-5c1f6c97f24b-utilities\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.084429 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzgww" 
event={"ID":"f914e9d1-11a6-46bd-af88-7a238ade220f","Type":"ContainerStarted","Data":"96be76d45c6e52b47c74bd51d9778e8e8f56e26f94f18b3f08b095d83d97dcfc"} Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.085637 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kzpq5" event={"ID":"b55039ae-0af8-42cc-a100-7b8893fb9400","Type":"ContainerStarted","Data":"3d41ae7b362e9868dbd3843b539ae38558f54666f7704e34f4bd93fde2e6e5f1"} Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.134003 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f56kx\" (UniqueName: \"kubernetes.io/projected/16d57199-f634-477a-a8d1-5c1f6c97f24b-kube-api-access-f56kx\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.134065 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d57199-f634-477a-a8d1-5c1f6c97f24b-catalog-content\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.134096 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d57199-f634-477a-a8d1-5c1f6c97f24b-utilities\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.134583 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d57199-f634-477a-a8d1-5c1f6c97f24b-utilities\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " 
pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.135004 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d57199-f634-477a-a8d1-5c1f6c97f24b-catalog-content\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.159549 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f56kx\" (UniqueName: \"kubernetes.io/projected/16d57199-f634-477a-a8d1-5c1f6c97f24b-kube-api-access-f56kx\") pod \"community-operators-jtlqq\" (UID: \"16d57199-f634-477a-a8d1-5c1f6c97f24b\") " pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.257753 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtlqq" Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.426023 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtlqq"] Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.537931 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c75p2"] Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.539876 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.541837 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.546640 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c75p2"]
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.639553 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503b63c7-1278-4eec-84bc-86223fe3ad04-utilities\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.639774 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503b63c7-1278-4eec-84bc-86223fe3ad04-catalog-content\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.639851 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sj96\" (UniqueName: \"kubernetes.io/projected/503b63c7-1278-4eec-84bc-86223fe3ad04-kube-api-access-7sj96\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.741937 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503b63c7-1278-4eec-84bc-86223fe3ad04-catalog-content\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.741982 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sj96\" (UniqueName: \"kubernetes.io/projected/503b63c7-1278-4eec-84bc-86223fe3ad04-kube-api-access-7sj96\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.742022 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503b63c7-1278-4eec-84bc-86223fe3ad04-utilities\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.742580 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503b63c7-1278-4eec-84bc-86223fe3ad04-utilities\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.742590 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503b63c7-1278-4eec-84bc-86223fe3ad04-catalog-content\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.760742 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sj96\" (UniqueName: \"kubernetes.io/projected/503b63c7-1278-4eec-84bc-86223fe3ad04-kube-api-access-7sj96\") pod \"certified-operators-c75p2\" (UID: \"503b63c7-1278-4eec-84bc-86223fe3ad04\") " pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:53 crc kubenswrapper[4842]: I0311 18:56:53.878631 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.091983 4842 generic.go:334] "Generic (PLEG): container finished" podID="16d57199-f634-477a-a8d1-5c1f6c97f24b" containerID="febc7c19e2aa66eb6c3320048f1f134f19d87ff7db0300f6de4bd51a6c446c83" exitCode=0
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.092030 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtlqq" event={"ID":"16d57199-f634-477a-a8d1-5c1f6c97f24b","Type":"ContainerDied","Data":"febc7c19e2aa66eb6c3320048f1f134f19d87ff7db0300f6de4bd51a6c446c83"}
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.092071 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtlqq" event={"ID":"16d57199-f634-477a-a8d1-5c1f6c97f24b","Type":"ContainerStarted","Data":"896036a39140d5a794f66203dc2ae35a86d8d622b0080c78cba4c5bce21b576f"}
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.094018 4842 generic.go:334] "Generic (PLEG): container finished" podID="f914e9d1-11a6-46bd-af88-7a238ade220f" containerID="96be76d45c6e52b47c74bd51d9778e8e8f56e26f94f18b3f08b095d83d97dcfc" exitCode=0
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.094106 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzgww" event={"ID":"f914e9d1-11a6-46bd-af88-7a238ade220f","Type":"ContainerDied","Data":"96be76d45c6e52b47c74bd51d9778e8e8f56e26f94f18b3f08b095d83d97dcfc"}
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.096584 4842 generic.go:334] "Generic (PLEG): container finished" podID="b55039ae-0af8-42cc-a100-7b8893fb9400" containerID="3d41ae7b362e9868dbd3843b539ae38558f54666f7704e34f4bd93fde2e6e5f1" exitCode=0
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.096615 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kzpq5" event={"ID":"b55039ae-0af8-42cc-a100-7b8893fb9400","Type":"ContainerDied","Data":"3d41ae7b362e9868dbd3843b539ae38558f54666f7704e34f4bd93fde2e6e5f1"}
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.646369 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hcnw9"
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.700458 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-twwzj"]
Mar 11 18:56:54 crc kubenswrapper[4842]: I0311 18:56:54.791679 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c75p2"]
Mar 11 18:56:54 crc kubenswrapper[4842]: W0311 18:56:54.797639 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod503b63c7_1278_4eec_84bc_86223fe3ad04.slice/crio-9a3d778ac66f42b39c7911b6fec2a706a2c382e0763d905e54f65195a44f132c WatchSource:0}: Error finding container 9a3d778ac66f42b39c7911b6fec2a706a2c382e0763d905e54f65195a44f132c: Status 404 returned error can't find the container with id 9a3d778ac66f42b39c7911b6fec2a706a2c382e0763d905e54f65195a44f132c
Mar 11 18:56:55 crc kubenswrapper[4842]: I0311 18:56:55.105979 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jzgww" event={"ID":"f914e9d1-11a6-46bd-af88-7a238ade220f","Type":"ContainerStarted","Data":"c96cf9f07306b723c6fea8e40f41e1a74621cfd57373a2a902d50cfd9820e171"}
Mar 11 18:56:55 crc kubenswrapper[4842]: I0311 18:56:55.108303 4842 generic.go:334] "Generic (PLEG): container finished" podID="503b63c7-1278-4eec-84bc-86223fe3ad04" containerID="0a97457a759ccea3eba0f3b5cbb15c9808ad2284314ce1db3d032999b87d27bc" exitCode=0
Mar 11 18:56:55 crc kubenswrapper[4842]: I0311 18:56:55.108344 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c75p2" event={"ID":"503b63c7-1278-4eec-84bc-86223fe3ad04","Type":"ContainerDied","Data":"0a97457a759ccea3eba0f3b5cbb15c9808ad2284314ce1db3d032999b87d27bc"}
Mar 11 18:56:55 crc kubenswrapper[4842]: I0311 18:56:55.108360 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c75p2" event={"ID":"503b63c7-1278-4eec-84bc-86223fe3ad04","Type":"ContainerStarted","Data":"9a3d778ac66f42b39c7911b6fec2a706a2c382e0763d905e54f65195a44f132c"}
Mar 11 18:56:55 crc kubenswrapper[4842]: I0311 18:56:55.113124 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kzpq5" event={"ID":"b55039ae-0af8-42cc-a100-7b8893fb9400","Type":"ContainerStarted","Data":"5e4afb50722638b54312705291aa4107a3089c55ade2d195fc10a3444c6862e0"}
Mar 11 18:56:55 crc kubenswrapper[4842]: I0311 18:56:55.136881 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jzgww" podStartSLOduration=1.627979706 podStartE2EDuration="4.13686583s" podCreationTimestamp="2026-03-11 18:56:51 +0000 UTC" firstStartedPulling="2026-03-11 18:56:52.064926428 +0000 UTC m=+457.712622708" lastFinishedPulling="2026-03-11 18:56:54.573812552 +0000 UTC m=+460.221508832" observedRunningTime="2026-03-11 18:56:55.134909075 +0000 UTC m=+460.782605355" watchObservedRunningTime="2026-03-11 18:56:55.13686583 +0000 UTC m=+460.784562110"
Mar 11 18:56:55 crc kubenswrapper[4842]: I0311 18:56:55.154237 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kzpq5" podStartSLOduration=2.6278039509999997 podStartE2EDuration="5.154221269s" podCreationTimestamp="2026-03-11 18:56:50 +0000 UTC" firstStartedPulling="2026-03-11 18:56:52.067420129 +0000 UTC m=+457.715116409" lastFinishedPulling="2026-03-11 18:56:54.593837447 +0000 UTC m=+460.241533727" observedRunningTime="2026-03-11 18:56:55.15210761 +0000 UTC m=+460.799803890" watchObservedRunningTime="2026-03-11 18:56:55.154221269 +0000 UTC m=+460.801917549"
Mar 11 18:56:56 crc kubenswrapper[4842]: I0311 18:56:56.119609 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c75p2" event={"ID":"503b63c7-1278-4eec-84bc-86223fe3ad04","Type":"ContainerStarted","Data":"0e06c5459d2957cd2b9bcbff64fc02c6962bb13d38d714739141c3dae1b51b00"}
Mar 11 18:56:56 crc kubenswrapper[4842]: I0311 18:56:56.121793 4842 generic.go:334] "Generic (PLEG): container finished" podID="16d57199-f634-477a-a8d1-5c1f6c97f24b" containerID="02d64b73c66ced31b66038dc97af049b6459054e06474d4a4c9fb29e62a83871" exitCode=0
Mar 11 18:56:56 crc kubenswrapper[4842]: I0311 18:56:56.123246 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtlqq" event={"ID":"16d57199-f634-477a-a8d1-5c1f6c97f24b","Type":"ContainerDied","Data":"02d64b73c66ced31b66038dc97af049b6459054e06474d4a4c9fb29e62a83871"}
Mar 11 18:56:57 crc kubenswrapper[4842]: I0311 18:56:57.128370 4842 generic.go:334] "Generic (PLEG): container finished" podID="503b63c7-1278-4eec-84bc-86223fe3ad04" containerID="0e06c5459d2957cd2b9bcbff64fc02c6962bb13d38d714739141c3dae1b51b00" exitCode=0
Mar 11 18:56:57 crc kubenswrapper[4842]: I0311 18:56:57.128701 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c75p2" event={"ID":"503b63c7-1278-4eec-84bc-86223fe3ad04","Type":"ContainerDied","Data":"0e06c5459d2957cd2b9bcbff64fc02c6962bb13d38d714739141c3dae1b51b00"}
Mar 11 18:56:57 crc kubenswrapper[4842]: I0311 18:56:57.136461 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtlqq" event={"ID":"16d57199-f634-477a-a8d1-5c1f6c97f24b","Type":"ContainerStarted","Data":"310487be867981568685424916f8be1d765534df24f6d39cd5e9c1c8e473e70a"}
Mar 11 18:56:57 crc kubenswrapper[4842]: I0311 18:56:57.165112 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jtlqq" podStartSLOduration=2.676718883 podStartE2EDuration="5.165097169s" podCreationTimestamp="2026-03-11 18:56:52 +0000 UTC" firstStartedPulling="2026-03-11 18:56:54.0936448 +0000 UTC m=+459.741341090" lastFinishedPulling="2026-03-11 18:56:56.582023086 +0000 UTC m=+462.229719376" observedRunningTime="2026-03-11 18:56:57.163204825 +0000 UTC m=+462.810901115" watchObservedRunningTime="2026-03-11 18:56:57.165097169 +0000 UTC m=+462.812793449"
Mar 11 18:56:58 crc kubenswrapper[4842]: I0311 18:56:58.143541 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c75p2" event={"ID":"503b63c7-1278-4eec-84bc-86223fe3ad04","Type":"ContainerStarted","Data":"a417deb0cbf201a4b74b9700ddc106ed89c7d28ab314bb3cea87d3dffdb7dc3a"}
Mar 11 18:56:58 crc kubenswrapper[4842]: I0311 18:56:58.163448 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c75p2" podStartSLOduration=2.432652086 podStartE2EDuration="5.163431203s" podCreationTimestamp="2026-03-11 18:56:53 +0000 UTC" firstStartedPulling="2026-03-11 18:56:55.109558051 +0000 UTC m=+460.757254331" lastFinishedPulling="2026-03-11 18:56:57.840337128 +0000 UTC m=+463.488033448" observedRunningTime="2026-03-11 18:56:58.159124642 +0000 UTC m=+463.806820932" watchObservedRunningTime="2026-03-11 18:56:58.163431203 +0000 UTC m=+463.811127483"
Mar 11 18:57:00 crc kubenswrapper[4842]: I0311 18:57:00.892777 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kzpq5"
Mar 11 18:57:00 crc kubenswrapper[4842]: I0311 18:57:00.893895 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kzpq5"
Mar 11 18:57:00 crc kubenswrapper[4842]: I0311 18:57:00.958291 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kzpq5"
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.212699 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kzpq5"
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.462871 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jzgww"
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.463222 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jzgww"
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.471644 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.471725 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.471787 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs"
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.472457 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4a7b65bec67b2a820939afb6031ff12cd763991ed39162f5d44f041b4219c2a"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 18:57:01 crc kubenswrapper[4842]: I0311 18:57:01.472553 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://b4a7b65bec67b2a820939afb6031ff12cd763991ed39162f5d44f041b4219c2a" gracePeriod=600
Mar 11 18:57:02 crc kubenswrapper[4842]: I0311 18:57:02.173897 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="b4a7b65bec67b2a820939afb6031ff12cd763991ed39162f5d44f041b4219c2a" exitCode=0
Mar 11 18:57:02 crc kubenswrapper[4842]: I0311 18:57:02.174030 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"b4a7b65bec67b2a820939afb6031ff12cd763991ed39162f5d44f041b4219c2a"}
Mar 11 18:57:02 crc kubenswrapper[4842]: I0311 18:57:02.174235 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"fd8f5bc1cb45867bb48f671ce0a50d33b14c76234538c7c5a98f898ebef3f305"}
Mar 11 18:57:02 crc kubenswrapper[4842]: I0311 18:57:02.174305 4842 scope.go:117] "RemoveContainer" containerID="284abb694e28384a136a1f5f2aa04ad6f27a892d21c0f320572c21f1735736cd"
Mar 11 18:57:02 crc kubenswrapper[4842]: I0311 18:57:02.507098 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jzgww" podUID="f914e9d1-11a6-46bd-af88-7a238ade220f" containerName="registry-server" probeResult="failure" output=<
Mar 11 18:57:02 crc kubenswrapper[4842]: timeout: failed to connect service ":50051" within 1s
Mar 11 18:57:02 crc kubenswrapper[4842]: >
Mar 11 18:57:03 crc kubenswrapper[4842]: I0311 18:57:03.258527 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jtlqq"
Mar 11 18:57:03 crc kubenswrapper[4842]: I0311 18:57:03.259518 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jtlqq"
Mar 11 18:57:03 crc kubenswrapper[4842]: I0311 18:57:03.304209 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jtlqq"
Mar 11 18:57:03 crc kubenswrapper[4842]: I0311 18:57:03.878995 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:57:03 crc kubenswrapper[4842]: I0311 18:57:03.879326 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:57:03 crc kubenswrapper[4842]: I0311 18:57:03.917136 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:57:04 crc kubenswrapper[4842]: I0311 18:57:04.239169 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jtlqq"
Mar 11 18:57:04 crc kubenswrapper[4842]: I0311 18:57:04.253961 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c75p2"
Mar 11 18:57:11 crc kubenswrapper[4842]: I0311 18:57:11.530649 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jzgww"
Mar 11 18:57:11 crc kubenswrapper[4842]: I0311 18:57:11.607603 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jzgww"
Mar 11 18:57:19 crc kubenswrapper[4842]: I0311 18:57:19.750437 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" podUID="1882c06f-22f3-4346-8435-418f034f7d09" containerName="registry" containerID="cri-o://5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3" gracePeriod=30
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.103581 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.270994 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-trusted-ca\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.271055 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1882c06f-22f3-4346-8435-418f034f7d09-ca-trust-extracted\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.271243 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.271280 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1882c06f-22f3-4346-8435-418f034f7d09-installation-pull-secrets\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.271364 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-registry-tls\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.271381 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krnkh\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-kube-api-access-krnkh\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.271405 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-registry-certificates\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.271429 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-bound-sa-token\") pod \"1882c06f-22f3-4346-8435-418f034f7d09\" (UID: \"1882c06f-22f3-4346-8435-418f034f7d09\") "
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.272069 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.272289 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.279226 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.279460 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.279942 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1882c06f-22f3-4346-8435-418f034f7d09-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.280788 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-kube-api-access-krnkh" (OuterVolumeSpecName: "kube-api-access-krnkh") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "kube-api-access-krnkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.294207 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.295184 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1882c06f-22f3-4346-8435-418f034f7d09-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1882c06f-22f3-4346-8435-418f034f7d09" (UID: "1882c06f-22f3-4346-8435-418f034f7d09"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.373055 4842 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.373091 4842 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.373102 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1882c06f-22f3-4346-8435-418f034f7d09-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.373114 4842 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1882c06f-22f3-4346-8435-418f034f7d09-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.373123 4842 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1882c06f-22f3-4346-8435-418f034f7d09-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.373131 4842 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.373139 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krnkh\" (UniqueName: \"kubernetes.io/projected/1882c06f-22f3-4346-8435-418f034f7d09-kube-api-access-krnkh\") on node \"crc\" DevicePath \"\""
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.430212 4842 generic.go:334] "Generic (PLEG): container finished" podID="1882c06f-22f3-4346-8435-418f034f7d09" containerID="5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3" exitCode=0
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.430259 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" event={"ID":"1882c06f-22f3-4346-8435-418f034f7d09","Type":"ContainerDied","Data":"5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3"}
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.430283 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj"
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.430306 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-twwzj" event={"ID":"1882c06f-22f3-4346-8435-418f034f7d09","Type":"ContainerDied","Data":"ffc89008597dee3227bf49cd34a9f5ea37e250ae3d21ef0eb0c4643e917eba40"}
Mar 11 18:57:20 crc kubenswrapper[4842]: I0311 18:57:20.430331 4842 scope.go:117] "RemoveContainer" containerID="5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3"
Mar 11 18:57:21 crc kubenswrapper[4842]: I0311 18:57:21.408340 4842 scope.go:117] "RemoveContainer" containerID="5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3"
Mar 11 18:57:21 crc kubenswrapper[4842]: E0311 18:57:21.410741 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3\": container with ID starting with 5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3 not found: ID does not exist" containerID="5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3"
Mar 11 18:57:21 crc kubenswrapper[4842]: I0311 18:57:21.410806 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3"} err="failed to get container status \"5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3\": rpc error: code = NotFound desc = could not find container \"5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3\": container with ID starting with 5ec1905bf5305f7e8ede0759502ffef0602b2f8acf387a863db40286c541b1d3 not found: ID does not exist"
Mar 11 18:57:21 crc kubenswrapper[4842]: I0311 18:57:21.411163 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-twwzj"]
Mar 11 18:57:21 crc kubenswrapper[4842]: I0311 18:57:21.414235 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-twwzj"]
Mar 11 18:57:22 crc kubenswrapper[4842]: I0311 18:57:22.973480 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1882c06f-22f3-4346-8435-418f034f7d09" path="/var/lib/kubelet/pods/1882c06f-22f3-4346-8435-418f034f7d09/volumes"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.144091 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554258-vbkkd"]
Mar 11 18:58:00 crc kubenswrapper[4842]: E0311 18:58:00.145098 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1882c06f-22f3-4346-8435-418f034f7d09" containerName="registry"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.145118 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1882c06f-22f3-4346-8435-418f034f7d09" containerName="registry"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.145276 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1882c06f-22f3-4346-8435-418f034f7d09" containerName="registry"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.145849 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554258-vbkkd"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.155669 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.155819 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.156078 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.157885 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554258-vbkkd"]
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.216885 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6qh\" (UniqueName: \"kubernetes.io/projected/5f246fb1-5e30-44fc-b041-8474f40b3936-kube-api-access-mz6qh\") pod \"auto-csr-approver-29554258-vbkkd\" (UID: \"5f246fb1-5e30-44fc-b041-8474f40b3936\") " pod="openshift-infra/auto-csr-approver-29554258-vbkkd"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.318921 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6qh\" (UniqueName: \"kubernetes.io/projected/5f246fb1-5e30-44fc-b041-8474f40b3936-kube-api-access-mz6qh\") pod \"auto-csr-approver-29554258-vbkkd\" (UID: \"5f246fb1-5e30-44fc-b041-8474f40b3936\") " pod="openshift-infra/auto-csr-approver-29554258-vbkkd"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.354136 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6qh\" (UniqueName: \"kubernetes.io/projected/5f246fb1-5e30-44fc-b041-8474f40b3936-kube-api-access-mz6qh\") pod \"auto-csr-approver-29554258-vbkkd\" (UID: \"5f246fb1-5e30-44fc-b041-8474f40b3936\") " pod="openshift-infra/auto-csr-approver-29554258-vbkkd"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.483338 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554258-vbkkd"
Mar 11 18:58:00 crc kubenswrapper[4842]: I0311 18:58:00.710122 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554258-vbkkd"]
Mar 11 18:58:00 crc kubenswrapper[4842]: W0311 18:58:00.717183 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f246fb1_5e30_44fc_b041_8474f40b3936.slice/crio-1213b89d3b6061b016102e92b6d0cff1045102af8874040219c4f1351905bfa9 WatchSource:0}: Error finding container 1213b89d3b6061b016102e92b6d0cff1045102af8874040219c4f1351905bfa9: Status 404 returned error can't find the container with id 1213b89d3b6061b016102e92b6d0cff1045102af8874040219c4f1351905bfa9
Mar 11 18:58:01 crc kubenswrapper[4842]: I0311 18:58:01.691356 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554258-vbkkd" event={"ID":"5f246fb1-5e30-44fc-b041-8474f40b3936","Type":"ContainerStarted","Data":"1213b89d3b6061b016102e92b6d0cff1045102af8874040219c4f1351905bfa9"}
Mar 11 18:58:02 crc kubenswrapper[4842]: I0311 18:58:02.701917 4842 generic.go:334] "Generic (PLEG): container finished" podID="5f246fb1-5e30-44fc-b041-8474f40b3936" containerID="7077e3ce76b911406b7ad5c2cb13c14ae14dab9da0067145247d697f3a09100f" exitCode=0
Mar 11 18:58:02 crc kubenswrapper[4842]: I0311 18:58:02.702018 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554258-vbkkd" event={"ID":"5f246fb1-5e30-44fc-b041-8474f40b3936","Type":"ContainerDied","Data":"7077e3ce76b911406b7ad5c2cb13c14ae14dab9da0067145247d697f3a09100f"}
Mar 11 18:58:03 crc kubenswrapper[4842]: I0311 18:58:03.961855 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554258-vbkkd"
Mar 11 18:58:04 crc kubenswrapper[4842]: I0311 18:58:04.066874 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz6qh\" (UniqueName: \"kubernetes.io/projected/5f246fb1-5e30-44fc-b041-8474f40b3936-kube-api-access-mz6qh\") pod \"5f246fb1-5e30-44fc-b041-8474f40b3936\" (UID: \"5f246fb1-5e30-44fc-b041-8474f40b3936\") "
Mar 11 18:58:04 crc kubenswrapper[4842]: I0311 18:58:04.074187 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f246fb1-5e30-44fc-b041-8474f40b3936-kube-api-access-mz6qh" (OuterVolumeSpecName: "kube-api-access-mz6qh") pod "5f246fb1-5e30-44fc-b041-8474f40b3936" (UID: "5f246fb1-5e30-44fc-b041-8474f40b3936"). InnerVolumeSpecName "kube-api-access-mz6qh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 18:58:04 crc kubenswrapper[4842]: I0311 18:58:04.168442 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz6qh\" (UniqueName: \"kubernetes.io/projected/5f246fb1-5e30-44fc-b041-8474f40b3936-kube-api-access-mz6qh\") on node \"crc\" DevicePath \"\""
Mar 11 18:58:04 crc kubenswrapper[4842]: I0311 18:58:04.720222 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554258-vbkkd" event={"ID":"5f246fb1-5e30-44fc-b041-8474f40b3936","Type":"ContainerDied","Data":"1213b89d3b6061b016102e92b6d0cff1045102af8874040219c4f1351905bfa9"}
Mar 11 18:58:04 crc kubenswrapper[4842]: I0311 18:58:04.720262 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1213b89d3b6061b016102e92b6d0cff1045102af8874040219c4f1351905bfa9"
Mar 11 18:58:04 crc kubenswrapper[4842]: I0311 18:58:04.720309 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554258-vbkkd"
Mar 11 18:58:05 crc kubenswrapper[4842]: I0311 18:58:05.020615 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554252-4qdmf"]
Mar 11 18:58:05 crc kubenswrapper[4842]: I0311 18:58:05.023706 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554252-4qdmf"]
Mar 11 18:58:06 crc kubenswrapper[4842]: I0311 18:58:06.969750 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a9b79e-4043-4dc7-b625-53e0962a745b" path="/var/lib/kubelet/pods/30a9b79e-4043-4dc7-b625-53e0962a745b/volumes"
Mar 11 18:59:01 crc kubenswrapper[4842]: I0311 18:59:01.472066 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 18:59:01 crc kubenswrapper[4842]: I0311 18:59:01.474247 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 18:59:23 crc kubenswrapper[4842]: I0311 18:59:23.648896 4842 scope.go:117] "RemoveContainer" containerID="c4da4933ba4b771058697b831475abbcbd9dbc33c9eb105659ae7a216b3cb42d"
Mar 11 18:59:31 crc kubenswrapper[4842]: I0311 18:59:31.472186 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 18:59:31 crc kubenswrapper[4842]:
I0311 18:59:31.472506 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.137221 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84"] Mar 11 19:00:00 crc kubenswrapper[4842]: E0311 19:00:00.138141 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f246fb1-5e30-44fc-b041-8474f40b3936" containerName="oc" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.138156 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f246fb1-5e30-44fc-b041-8474f40b3936" containerName="oc" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.138254 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f246fb1-5e30-44fc-b041-8474f40b3936" containerName="oc" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.138764 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.140769 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.143696 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.144066 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554260-bkntv"] Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.145103 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554260-bkntv" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.153246 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.153328 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.153482 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.159793 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84"] Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.166686 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554260-bkntv"] Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.179779 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1aa02634-f796-4d40-8bab-02ddff4401f6-secret-volume\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.179852 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa02634-f796-4d40-8bab-02ddff4401f6-config-volume\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.179928 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99d8n\" (UniqueName: \"kubernetes.io/projected/a6198d49-2e45-41da-9980-ada387fc0276-kube-api-access-99d8n\") pod \"auto-csr-approver-29554260-bkntv\" (UID: \"a6198d49-2e45-41da-9980-ada387fc0276\") " pod="openshift-infra/auto-csr-approver-29554260-bkntv" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.180009 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jxd7\" (UniqueName: \"kubernetes.io/projected/1aa02634-f796-4d40-8bab-02ddff4401f6-kube-api-access-6jxd7\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.280816 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa02634-f796-4d40-8bab-02ddff4401f6-config-volume\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 
19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.281125 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99d8n\" (UniqueName: \"kubernetes.io/projected/a6198d49-2e45-41da-9980-ada387fc0276-kube-api-access-99d8n\") pod \"auto-csr-approver-29554260-bkntv\" (UID: \"a6198d49-2e45-41da-9980-ada387fc0276\") " pod="openshift-infra/auto-csr-approver-29554260-bkntv" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.281281 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jxd7\" (UniqueName: \"kubernetes.io/projected/1aa02634-f796-4d40-8bab-02ddff4401f6-kube-api-access-6jxd7\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.281429 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa02634-f796-4d40-8bab-02ddff4401f6-secret-volume\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.282078 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa02634-f796-4d40-8bab-02ddff4401f6-config-volume\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.296081 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa02634-f796-4d40-8bab-02ddff4401f6-secret-volume\") pod \"collect-profiles-29554260-gpr84\" (UID: 
\"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.299802 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99d8n\" (UniqueName: \"kubernetes.io/projected/a6198d49-2e45-41da-9980-ada387fc0276-kube-api-access-99d8n\") pod \"auto-csr-approver-29554260-bkntv\" (UID: \"a6198d49-2e45-41da-9980-ada387fc0276\") " pod="openshift-infra/auto-csr-approver-29554260-bkntv" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.312612 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jxd7\" (UniqueName: \"kubernetes.io/projected/1aa02634-f796-4d40-8bab-02ddff4401f6-kube-api-access-6jxd7\") pod \"collect-profiles-29554260-gpr84\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.459113 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.470737 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554260-bkntv" Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.651889 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554260-bkntv"] Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.659129 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 19:00:00 crc kubenswrapper[4842]: I0311 19:00:00.693798 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84"] Mar 11 19:00:00 crc kubenswrapper[4842]: W0311 19:00:00.700459 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa02634_f796_4d40_8bab_02ddff4401f6.slice/crio-139a15de179c81f85762f149c67bbdb4f2fdc5ec4f7f2c244aefbb19682aa5e7 WatchSource:0}: Error finding container 139a15de179c81f85762f149c67bbdb4f2fdc5ec4f7f2c244aefbb19682aa5e7: Status 404 returned error can't find the container with id 139a15de179c81f85762f149c67bbdb4f2fdc5ec4f7f2c244aefbb19682aa5e7 Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.472174 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.472481 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.472527 4842 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.473021 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd8f5bc1cb45867bb48f671ce0a50d33b14c76234538c7c5a98f898ebef3f305"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.473063 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://fd8f5bc1cb45867bb48f671ce0a50d33b14c76234538c7c5a98f898ebef3f305" gracePeriod=600 Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.659420 4842 generic.go:334] "Generic (PLEG): container finished" podID="1aa02634-f796-4d40-8bab-02ddff4401f6" containerID="a9292407e586b32b0e4f413a45c8cff8fff816fe84cd1efd0a72e633a9eb37f5" exitCode=0 Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.659492 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" event={"ID":"1aa02634-f796-4d40-8bab-02ddff4401f6","Type":"ContainerDied","Data":"a9292407e586b32b0e4f413a45c8cff8fff816fe84cd1efd0a72e633a9eb37f5"} Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.659521 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" event={"ID":"1aa02634-f796-4d40-8bab-02ddff4401f6","Type":"ContainerStarted","Data":"139a15de179c81f85762f149c67bbdb4f2fdc5ec4f7f2c244aefbb19682aa5e7"} Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.662220 4842 generic.go:334] "Generic (PLEG): container 
finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="fd8f5bc1cb45867bb48f671ce0a50d33b14c76234538c7c5a98f898ebef3f305" exitCode=0 Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.662253 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"fd8f5bc1cb45867bb48f671ce0a50d33b14c76234538c7c5a98f898ebef3f305"} Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.662293 4842 scope.go:117] "RemoveContainer" containerID="b4a7b65bec67b2a820939afb6031ff12cd763991ed39162f5d44f041b4219c2a" Mar 11 19:00:01 crc kubenswrapper[4842]: I0311 19:00:01.664222 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554260-bkntv" event={"ID":"a6198d49-2e45-41da-9980-ada387fc0276","Type":"ContainerStarted","Data":"1dfa53af81b500efb77633ad2de595ea565503471cb88195038b687e35071495"} Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.674540 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"dd17d155f0763fe0e3f142ca18755ab7a2e8fd0c5e83a7bdd2e0037d15a4c528"} Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.906244 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.924647 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jxd7\" (UniqueName: \"kubernetes.io/projected/1aa02634-f796-4d40-8bab-02ddff4401f6-kube-api-access-6jxd7\") pod \"1aa02634-f796-4d40-8bab-02ddff4401f6\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.924697 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa02634-f796-4d40-8bab-02ddff4401f6-secret-volume\") pod \"1aa02634-f796-4d40-8bab-02ddff4401f6\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.924758 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa02634-f796-4d40-8bab-02ddff4401f6-config-volume\") pod \"1aa02634-f796-4d40-8bab-02ddff4401f6\" (UID: \"1aa02634-f796-4d40-8bab-02ddff4401f6\") " Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.925894 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa02634-f796-4d40-8bab-02ddff4401f6-config-volume" (OuterVolumeSpecName: "config-volume") pod "1aa02634-f796-4d40-8bab-02ddff4401f6" (UID: "1aa02634-f796-4d40-8bab-02ddff4401f6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.940277 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa02634-f796-4d40-8bab-02ddff4401f6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1aa02634-f796-4d40-8bab-02ddff4401f6" (UID: "1aa02634-f796-4d40-8bab-02ddff4401f6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:00:02 crc kubenswrapper[4842]: I0311 19:00:02.940317 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa02634-f796-4d40-8bab-02ddff4401f6-kube-api-access-6jxd7" (OuterVolumeSpecName: "kube-api-access-6jxd7") pod "1aa02634-f796-4d40-8bab-02ddff4401f6" (UID: "1aa02634-f796-4d40-8bab-02ddff4401f6"). InnerVolumeSpecName "kube-api-access-6jxd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:00:03 crc kubenswrapper[4842]: I0311 19:00:03.026346 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jxd7\" (UniqueName: \"kubernetes.io/projected/1aa02634-f796-4d40-8bab-02ddff4401f6-kube-api-access-6jxd7\") on node \"crc\" DevicePath \"\"" Mar 11 19:00:03 crc kubenswrapper[4842]: I0311 19:00:03.026406 4842 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa02634-f796-4d40-8bab-02ddff4401f6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 19:00:03 crc kubenswrapper[4842]: I0311 19:00:03.026419 4842 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa02634-f796-4d40-8bab-02ddff4401f6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 19:00:03 crc kubenswrapper[4842]: I0311 19:00:03.688855 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" Mar 11 19:00:03 crc kubenswrapper[4842]: I0311 19:00:03.689323 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554260-gpr84" event={"ID":"1aa02634-f796-4d40-8bab-02ddff4401f6","Type":"ContainerDied","Data":"139a15de179c81f85762f149c67bbdb4f2fdc5ec4f7f2c244aefbb19682aa5e7"} Mar 11 19:00:03 crc kubenswrapper[4842]: I0311 19:00:03.689375 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="139a15de179c81f85762f149c67bbdb4f2fdc5ec4f7f2c244aefbb19682aa5e7" Mar 11 19:00:04 crc kubenswrapper[4842]: I0311 19:00:04.695755 4842 generic.go:334] "Generic (PLEG): container finished" podID="a6198d49-2e45-41da-9980-ada387fc0276" containerID="fce9e339213a9e76d6851137264b2cd4b9473d4dca280a11a64445a80425708a" exitCode=0 Mar 11 19:00:04 crc kubenswrapper[4842]: I0311 19:00:04.695823 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554260-bkntv" event={"ID":"a6198d49-2e45-41da-9980-ada387fc0276","Type":"ContainerDied","Data":"fce9e339213a9e76d6851137264b2cd4b9473d4dca280a11a64445a80425708a"} Mar 11 19:00:05 crc kubenswrapper[4842]: I0311 19:00:05.909432 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554260-bkntv" Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.057524 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99d8n\" (UniqueName: \"kubernetes.io/projected/a6198d49-2e45-41da-9980-ada387fc0276-kube-api-access-99d8n\") pod \"a6198d49-2e45-41da-9980-ada387fc0276\" (UID: \"a6198d49-2e45-41da-9980-ada387fc0276\") " Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.069229 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6198d49-2e45-41da-9980-ada387fc0276-kube-api-access-99d8n" (OuterVolumeSpecName: "kube-api-access-99d8n") pod "a6198d49-2e45-41da-9980-ada387fc0276" (UID: "a6198d49-2e45-41da-9980-ada387fc0276"). InnerVolumeSpecName "kube-api-access-99d8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.158801 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99d8n\" (UniqueName: \"kubernetes.io/projected/a6198d49-2e45-41da-9980-ada387fc0276-kube-api-access-99d8n\") on node \"crc\" DevicePath \"\"" Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.710484 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554260-bkntv" event={"ID":"a6198d49-2e45-41da-9980-ada387fc0276","Type":"ContainerDied","Data":"1dfa53af81b500efb77633ad2de595ea565503471cb88195038b687e35071495"} Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.711022 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfa53af81b500efb77633ad2de595ea565503471cb88195038b687e35071495" Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.711145 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554260-bkntv" Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.970056 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554254-vzflg"] Mar 11 19:00:06 crc kubenswrapper[4842]: I0311 19:00:06.976741 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554254-vzflg"] Mar 11 19:00:08 crc kubenswrapper[4842]: I0311 19:00:08.971451 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842d8359-baaa-48cc-b80f-28a6e0045e8b" path="/var/lib/kubelet/pods/842d8359-baaa-48cc-b80f-28a6e0045e8b/volumes" Mar 11 19:00:23 crc kubenswrapper[4842]: I0311 19:00:23.674053 4842 scope.go:117] "RemoveContainer" containerID="962be4f0ab85aa58ee0681d0cf5db1d1f3f7d476fd6e64c4fbd5be31d60cf079" Mar 11 19:00:23 crc kubenswrapper[4842]: I0311 19:00:23.696199 4842 scope.go:117] "RemoveContainer" containerID="a26553ba937c4505a429758298d6c48f723370ed334257b27cdce5bde464d91a" Mar 11 19:00:23 crc kubenswrapper[4842]: I0311 19:00:23.731956 4842 scope.go:117] "RemoveContainer" containerID="53625180e2224384877ad13da1cdb960582d8b35a0042f9e0bfc69a0f3b11fce" Mar 11 19:00:23 crc kubenswrapper[4842]: I0311 19:00:23.777480 4842 scope.go:117] "RemoveContainer" containerID="8f13e05c9a9edcb02f07b48ac2f49aec6281ee35d8867c0f8c2be887e237e694" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.139541 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554262-lvj4j"] Mar 11 19:02:00 crc kubenswrapper[4842]: E0311 19:02:00.140451 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa02634-f796-4d40-8bab-02ddff4401f6" containerName="collect-profiles" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.140466 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa02634-f796-4d40-8bab-02ddff4401f6" containerName="collect-profiles" Mar 11 19:02:00 crc 
kubenswrapper[4842]: E0311 19:02:00.140481 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6198d49-2e45-41da-9980-ada387fc0276" containerName="oc" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.140487 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6198d49-2e45-41da-9980-ada387fc0276" containerName="oc" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.140580 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa02634-f796-4d40-8bab-02ddff4401f6" containerName="collect-profiles" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.140591 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6198d49-2e45-41da-9980-ada387fc0276" containerName="oc" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.140923 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.143562 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.143551 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.148127 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.164940 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554262-lvj4j"] Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.254591 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8q9\" (UniqueName: \"kubernetes.io/projected/0bd5e323-3859-4747-94f7-20765fe176e2-kube-api-access-mr8q9\") pod \"auto-csr-approver-29554262-lvj4j\" (UID: 
\"0bd5e323-3859-4747-94f7-20765fe176e2\") " pod="openshift-infra/auto-csr-approver-29554262-lvj4j" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.356183 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8q9\" (UniqueName: \"kubernetes.io/projected/0bd5e323-3859-4747-94f7-20765fe176e2-kube-api-access-mr8q9\") pod \"auto-csr-approver-29554262-lvj4j\" (UID: \"0bd5e323-3859-4747-94f7-20765fe176e2\") " pod="openshift-infra/auto-csr-approver-29554262-lvj4j" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.377168 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8q9\" (UniqueName: \"kubernetes.io/projected/0bd5e323-3859-4747-94f7-20765fe176e2-kube-api-access-mr8q9\") pod \"auto-csr-approver-29554262-lvj4j\" (UID: \"0bd5e323-3859-4747-94f7-20765fe176e2\") " pod="openshift-infra/auto-csr-approver-29554262-lvj4j" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.473776 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" Mar 11 19:02:00 crc kubenswrapper[4842]: I0311 19:02:00.715558 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554262-lvj4j"] Mar 11 19:02:01 crc kubenswrapper[4842]: I0311 19:02:01.441404 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" event={"ID":"0bd5e323-3859-4747-94f7-20765fe176e2","Type":"ContainerStarted","Data":"54a653f766f80a723459ad28ea1f562fc00508d13a968599b2ae1ee57f05f743"} Mar 11 19:02:01 crc kubenswrapper[4842]: I0311 19:02:01.471912 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:02:01 crc kubenswrapper[4842]: I0311 19:02:01.472020 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:02:02 crc kubenswrapper[4842]: I0311 19:02:02.450664 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" event={"ID":"0bd5e323-3859-4747-94f7-20765fe176e2","Type":"ContainerStarted","Data":"9d36a6b1bbfc694d6c7ebf85a1e13db7373d20cabe44b67d73e4b99b23c02bb9"} Mar 11 19:02:02 crc kubenswrapper[4842]: I0311 19:02:02.476260 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" podStartSLOduration=1.30155058 podStartE2EDuration="2.476228881s" podCreationTimestamp="2026-03-11 19:02:00 +0000 UTC" firstStartedPulling="2026-03-11 
19:02:00.724376665 +0000 UTC m=+766.372072935" lastFinishedPulling="2026-03-11 19:02:01.899054956 +0000 UTC m=+767.546751236" observedRunningTime="2026-03-11 19:02:02.475157291 +0000 UTC m=+768.122853601" watchObservedRunningTime="2026-03-11 19:02:02.476228881 +0000 UTC m=+768.123925171" Mar 11 19:02:03 crc kubenswrapper[4842]: I0311 19:02:03.457226 4842 generic.go:334] "Generic (PLEG): container finished" podID="0bd5e323-3859-4747-94f7-20765fe176e2" containerID="9d36a6b1bbfc694d6c7ebf85a1e13db7373d20cabe44b67d73e4b99b23c02bb9" exitCode=0 Mar 11 19:02:03 crc kubenswrapper[4842]: I0311 19:02:03.457319 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" event={"ID":"0bd5e323-3859-4747-94f7-20765fe176e2","Type":"ContainerDied","Data":"9d36a6b1bbfc694d6c7ebf85a1e13db7373d20cabe44b67d73e4b99b23c02bb9"} Mar 11 19:02:04 crc kubenswrapper[4842]: I0311 19:02:04.697176 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" Mar 11 19:02:04 crc kubenswrapper[4842]: I0311 19:02:04.818783 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr8q9\" (UniqueName: \"kubernetes.io/projected/0bd5e323-3859-4747-94f7-20765fe176e2-kube-api-access-mr8q9\") pod \"0bd5e323-3859-4747-94f7-20765fe176e2\" (UID: \"0bd5e323-3859-4747-94f7-20765fe176e2\") " Mar 11 19:02:04 crc kubenswrapper[4842]: I0311 19:02:04.826217 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd5e323-3859-4747-94f7-20765fe176e2-kube-api-access-mr8q9" (OuterVolumeSpecName: "kube-api-access-mr8q9") pod "0bd5e323-3859-4747-94f7-20765fe176e2" (UID: "0bd5e323-3859-4747-94f7-20765fe176e2"). InnerVolumeSpecName "kube-api-access-mr8q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:02:04 crc kubenswrapper[4842]: I0311 19:02:04.920811 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr8q9\" (UniqueName: \"kubernetes.io/projected/0bd5e323-3859-4747-94f7-20765fe176e2-kube-api-access-mr8q9\") on node \"crc\" DevicePath \"\"" Mar 11 19:02:05 crc kubenswrapper[4842]: I0311 19:02:05.474077 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" event={"ID":"0bd5e323-3859-4747-94f7-20765fe176e2","Type":"ContainerDied","Data":"54a653f766f80a723459ad28ea1f562fc00508d13a968599b2ae1ee57f05f743"} Mar 11 19:02:05 crc kubenswrapper[4842]: I0311 19:02:05.474134 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a653f766f80a723459ad28ea1f562fc00508d13a968599b2ae1ee57f05f743" Mar 11 19:02:05 crc kubenswrapper[4842]: I0311 19:02:05.474186 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554262-lvj4j" Mar 11 19:02:05 crc kubenswrapper[4842]: I0311 19:02:05.775582 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554256-mx227"] Mar 11 19:02:05 crc kubenswrapper[4842]: I0311 19:02:05.778851 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554256-mx227"] Mar 11 19:02:06 crc kubenswrapper[4842]: I0311 19:02:06.970182 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad25a877-2509-4417-8ece-9413e28450a3" path="/var/lib/kubelet/pods/ad25a877-2509-4417-8ece-9413e28450a3/volumes" Mar 11 19:02:23 crc kubenswrapper[4842]: I0311 19:02:23.865396 4842 scope.go:117] "RemoveContainer" containerID="f98f6cbf392e6bf1410f92338635f01783785262d95b32645664783ad01b3511" Mar 11 19:02:31 crc kubenswrapper[4842]: I0311 19:02:31.472363 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:02:31 crc kubenswrapper[4842]: I0311 19:02:31.473346 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.591134 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm"] Mar 11 19:02:46 crc kubenswrapper[4842]: E0311 19:02:46.592700 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd5e323-3859-4747-94f7-20765fe176e2" containerName="oc" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.592769 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd5e323-3859-4747-94f7-20765fe176e2" containerName="oc" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.592900 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd5e323-3859-4747-94f7-20765fe176e2" containerName="oc" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.593658 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.595502 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.603498 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm"] Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.699096 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.699153 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc6tn\" (UniqueName: \"kubernetes.io/projected/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-kube-api-access-gc6tn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.699191 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: 
I0311 19:02:46.800749 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.801097 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc6tn\" (UniqueName: \"kubernetes.io/projected/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-kube-api-access-gc6tn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.801209 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.801411 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.801636 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.832466 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc6tn\" (UniqueName: \"kubernetes.io/projected/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-kube-api-access-gc6tn\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:46 crc kubenswrapper[4842]: I0311 19:02:46.919309 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:47 crc kubenswrapper[4842]: I0311 19:02:47.148984 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm"] Mar 11 19:02:47 crc kubenswrapper[4842]: I0311 19:02:47.772471 4842 generic.go:334] "Generic (PLEG): container finished" podID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerID="79f8a0b5ed9379aec9755c63fd6314708bdae250e5d65de71599398bf0221584" exitCode=0 Mar 11 19:02:47 crc kubenswrapper[4842]: I0311 19:02:47.772529 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" event={"ID":"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5","Type":"ContainerDied","Data":"79f8a0b5ed9379aec9755c63fd6314708bdae250e5d65de71599398bf0221584"} Mar 11 19:02:47 crc kubenswrapper[4842]: I0311 19:02:47.772570 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" event={"ID":"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5","Type":"ContainerStarted","Data":"b3b75f8a1f249b4474bfd006cc299fc78dc521b3e1c0e9a69981568afde16ab5"} Mar 11 19:02:54 crc kubenswrapper[4842]: I0311 19:02:54.808667 4842 generic.go:334] "Generic (PLEG): container finished" podID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerID="dccb4518a7838a377bd7016c65e207767d69b52e0c4684bad76650a15ce82bb0" exitCode=0 Mar 11 19:02:54 crc kubenswrapper[4842]: I0311 19:02:54.808897 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" event={"ID":"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5","Type":"ContainerDied","Data":"dccb4518a7838a377bd7016c65e207767d69b52e0c4684bad76650a15ce82bb0"} Mar 11 19:02:55 crc kubenswrapper[4842]: I0311 19:02:55.817291 4842 generic.go:334] "Generic (PLEG): container finished" podID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerID="28472cefa99da3f285941e3c379662f15ea20c17175d88dda87052d6d93f7a94" exitCode=0 Mar 11 19:02:55 crc kubenswrapper[4842]: I0311 19:02:55.817345 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" event={"ID":"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5","Type":"ContainerDied","Data":"28472cefa99da3f285941e3c379662f15ea20c17175d88dda87052d6d93f7a94"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408006 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsn92"] Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408474 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-controller" containerID="cri-o://5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b" 
gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408552 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="nbdb" containerID="cri-o://522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965" gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408586 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90" gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408687 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="sbdb" containerID="cri-o://2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057" gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408621 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-acl-logging" containerID="cri-o://bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e" gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408608 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-node" containerID="cri-o://f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713" gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.408751 4842 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="northd" containerID="cri-o://ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492" gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.457597 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller" containerID="cri-o://915fc3a7862038ff15ddd6b82f3fcb6a5af04baf8d5a5f62544049b2607b9f18" gracePeriod=30 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.835643 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovnkube-controller/3.log" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.842083 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovn-acl-logging/0.log" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.843119 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovn-controller/0.log" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844016 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="915fc3a7862038ff15ddd6b82f3fcb6a5af04baf8d5a5f62544049b2607b9f18" exitCode=0 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844046 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057" exitCode=0 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844058 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" 
containerID="522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965" exitCode=0 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844069 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492" exitCode=0 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844082 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90" exitCode=0 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844092 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713" exitCode=0 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844104 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e" exitCode=143 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844114 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerID="5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b" exitCode=143 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844123 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"915fc3a7862038ff15ddd6b82f3fcb6a5af04baf8d5a5f62544049b2607b9f18"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844187 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" 
event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844209 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844230 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844248 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844299 4842 scope.go:117] "RemoveContainer" containerID="2feed929efa75ab6ee70beb341cc19fa55e8e0a05c6193db1b6210ec4cd36ab1" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844313 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844468 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.844489 4842 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.847196 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/2.log" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.847748 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/1.log" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.847788 4842 generic.go:334] "Generic (PLEG): container finished" podID="3827ef7b-1abd-4dea-acf3-474eed7b3860" containerID="8495aa5c205b0ecc00ef526be4fec937aacfe0fca4b2ed45604286f61edd1612" exitCode=2 Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.847961 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerDied","Data":"8495aa5c205b0ecc00ef526be4fec937aacfe0fca4b2ed45604286f61edd1612"} Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.849135 4842 scope.go:117] "RemoveContainer" containerID="8495aa5c205b0ecc00ef526be4fec937aacfe0fca4b2ed45604286f61edd1612" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.873589 4842 scope.go:117] "RemoveContainer" containerID="42e3acef3aae896aa0b112817dc87fa20428aa434ede7b5a7fcbdc160bd4174a" Mar 11 19:02:56 crc kubenswrapper[4842]: I0311 19:02:56.897655 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.062701 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc6tn\" (UniqueName: \"kubernetes.io/projected/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-kube-api-access-gc6tn\") pod \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.062757 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-util\") pod \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.062870 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-bundle\") pod \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\" (UID: \"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5\") " Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.064042 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-bundle" (OuterVolumeSpecName: "bundle") pod "edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" (UID: "edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.071366 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-kube-api-access-gc6tn" (OuterVolumeSpecName: "kube-api-access-gc6tn") pod "edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" (UID: "edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5"). InnerVolumeSpecName "kube-api-access-gc6tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.078551 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-util" (OuterVolumeSpecName: "util") pod "edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" (UID: "edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.164732 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc6tn\" (UniqueName: \"kubernetes.io/projected/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-kube-api-access-gc6tn\") on node \"crc\" DevicePath \"\"" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.164772 4842 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-util\") on node \"crc\" DevicePath \"\"" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.164786 4842 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.285349 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovn-acl-logging/0.log" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.286018 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovn-controller/0.log" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.286809 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360239 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p88hb"] Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360552 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360569 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360586 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="sbdb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360593 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="sbdb" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360606 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-controller" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360616 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-controller" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360624 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-node" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360632 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-node" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360641 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" 
containerName="util" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360649 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerName="util" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360661 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360669 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360681 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kubecfg-setup" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360689 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kubecfg-setup" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360701 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-acl-logging" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360710 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-acl-logging" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360722 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="nbdb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360730 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="nbdb" Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360738 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller" Mar 
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360745 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360757 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="northd"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360764 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="northd"
Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360775 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerName="pull"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360782 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerName="pull"
Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360793 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360800 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.360810 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerName="extract"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360817 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerName="extract"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360919 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360934 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-acl-logging"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360945 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-node"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360954 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="kube-rbac-proxy-ovn-metrics"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360965 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="northd"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360973 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360981 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="nbdb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360989 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="sbdb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.360998 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5" containerName="extract"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.361010 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovn-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.361193 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.361213 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.361414 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.361427 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: E0311 19:02:57.361439 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.361447 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.361555 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" containerName="ovnkube-controller"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.363589 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.469840 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-script-lib\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.469908 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-etc-openvswitch\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.469936 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-log-socket\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.469958 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-env-overrides\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470004 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-netns\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470028 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-ovn-kubernetes\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470019 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470049 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470084 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovn-node-metrics-cert\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470122 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-ovn\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470144 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-slash\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470166 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-bin\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470204 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-openvswitch\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470222 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-node-log\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470241 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-var-lib-openvswitch\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470294 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-685hx\" (UniqueName: \"kubernetes.io/projected/5c32da15-9b98-45c1-be42-d7d0e89428c5-kube-api-access-685hx\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470334 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-kubelet\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470379 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-systemd\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470416 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-config\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470446 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-netd\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470468 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-systemd-units\") pod \"5c32da15-9b98-45c1-be42-d7d0e89428c5\" (UID: \"5c32da15-9b98-45c1-be42-d7d0e89428c5\") "
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470466 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470475 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-log-socket" (OuterVolumeSpecName: "log-socket") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470501 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470516 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470551 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470554 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470571 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470574 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470599 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-node-log" (OuterVolumeSpecName: "node-log") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470629 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470673 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470778 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.470964 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-systemd-units\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471226 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-slash" (OuterVolumeSpecName: "host-slash") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471335 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-var-lib-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471409 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-cni-bin\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471241 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471305 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471262 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471450 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-systemd\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471542 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-cni-netd\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471594 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471658 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-etc-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471830 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-ovn\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471910 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-ovnkube-script-lib\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.471987 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-log-socket\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472041 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-run-netns\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472072 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7frgp\" (UniqueName: \"kubernetes.io/projected/0df03b05-5e61-410e-b300-2badef3f6a03-kube-api-access-7frgp\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472103 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-run-ovn-kubernetes\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472126 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0df03b05-5e61-410e-b300-2badef3f6a03-ovn-node-metrics-cert\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472219 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-env-overrides\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472253 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-kubelet\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472330 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-ovnkube-config\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472405 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472445 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-slash\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472499 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-node-log\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472625 4842 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472648 4842 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472665 4842 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472682 4842 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472699 4842 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472716 4842 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472733 4842 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-log-socket\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472792 4842 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c32da15-9b98-45c1-be42-d7d0e89428c5-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472814 4842 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472833 4842 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472853 4842 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472872 4842 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472889 4842 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-slash\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472905 4842 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472924 4842 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472940 4842 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-node-log\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.472957 4842 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.475061 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.475418 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c32da15-9b98-45c1-be42-d7d0e89428c5-kube-api-access-685hx" (OuterVolumeSpecName: "kube-api-access-685hx") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "kube-api-access-685hx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.484553 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5c32da15-9b98-45c1-be42-d7d0e89428c5" (UID: "5c32da15-9b98-45c1-be42-d7d0e89428c5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573818 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-env-overrides\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573869 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-kubelet\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573887 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-ovnkube-config\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573911 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573934 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-slash\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573953 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-node-log\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573973 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-systemd-units\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.573995 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-var-lib-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574027 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-cni-bin\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574050 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-systemd\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574061 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-slash\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574070 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-cni-netd\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574100 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-cni-netd\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574125 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574126 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-var-lib-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb"
Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574141 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-cni-bin\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574163 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-systemd\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574185 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-etc-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574193 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574157 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-etc-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574217 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-systemd-units\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574198 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-kubelet\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574230 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-ovn\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574227 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-openvswitch\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574265 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-run-ovn\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574293 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-ovnkube-script-lib\") pod \"ovnkube-node-p88hb\" (UID: 
\"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574258 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-node-log\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574353 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-log-socket\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574403 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-log-socket\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574430 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-run-netns\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574454 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7frgp\" (UniqueName: \"kubernetes.io/projected/0df03b05-5e61-410e-b300-2badef3f6a03-kube-api-access-7frgp\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 
crc kubenswrapper[4842]: I0311 19:02:57.574476 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-run-ovn-kubernetes\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574489 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-run-netns\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574501 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0df03b05-5e61-410e-b300-2badef3f6a03-ovn-node-metrics-cert\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574545 4842 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c32da15-9b98-45c1-be42-d7d0e89428c5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574560 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-685hx\" (UniqueName: \"kubernetes.io/projected/5c32da15-9b98-45c1-be42-d7d0e89428c5-kube-api-access-685hx\") on node \"crc\" DevicePath \"\"" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574574 4842 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c32da15-9b98-45c1-be42-d7d0e89428c5-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 11 19:02:57 crc 
kubenswrapper[4842]: I0311 19:02:57.574557 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0df03b05-5e61-410e-b300-2badef3f6a03-host-run-ovn-kubernetes\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.574993 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-env-overrides\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.575545 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-ovnkube-script-lib\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.575542 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0df03b05-5e61-410e-b300-2badef3f6a03-ovnkube-config\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.579707 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0df03b05-5e61-410e-b300-2badef3f6a03-ovn-node-metrics-cert\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.594973 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7frgp\" (UniqueName: \"kubernetes.io/projected/0df03b05-5e61-410e-b300-2badef3f6a03-kube-api-access-7frgp\") pod \"ovnkube-node-p88hb\" (UID: \"0df03b05-5e61-410e-b300-2badef3f6a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.691548 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:02:57 crc kubenswrapper[4842]: W0311 19:02:57.716334 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df03b05_5e61_410e_b300_2badef3f6a03.slice/crio-85fe54bdad32ba2367f49a58f6591bfac3b6103bb1ee6fa525e8db68bb973dfe WatchSource:0}: Error finding container 85fe54bdad32ba2367f49a58f6591bfac3b6103bb1ee6fa525e8db68bb973dfe: Status 404 returned error can't find the container with id 85fe54bdad32ba2367f49a58f6591bfac3b6103bb1ee6fa525e8db68bb973dfe Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.859482 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovn-acl-logging/0.log" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.859933 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsn92_5c32da15-9b98-45c1-be42-d7d0e89428c5/ovn-controller/0.log" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.860294 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" event={"ID":"5c32da15-9b98-45c1-be42-d7d0e89428c5","Type":"ContainerDied","Data":"6fcd2e0332fdba0c5fb56f8075f1798a46587dde93fb7f43b4cf9bf41603bead"} Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.860335 4842 scope.go:117] "RemoveContainer" containerID="915fc3a7862038ff15ddd6b82f3fcb6a5af04baf8d5a5f62544049b2607b9f18" Mar 11 19:02:57 crc 
kubenswrapper[4842]: I0311 19:02:57.860462 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsn92" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.863441 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"85fe54bdad32ba2367f49a58f6591bfac3b6103bb1ee6fa525e8db68bb973dfe"} Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.866377 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2hhn6_3827ef7b-1abd-4dea-acf3-474eed7b3860/kube-multus/2.log" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.866466 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2hhn6" event={"ID":"3827ef7b-1abd-4dea-acf3-474eed7b3860","Type":"ContainerStarted","Data":"d5524b62f7f1ec59739137e558032442126cd5f38332f6f6432f16c47e2d7f6f"} Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.868740 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" event={"ID":"edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5","Type":"ContainerDied","Data":"b3b75f8a1f249b4474bfd006cc299fc78dc521b3e1c0e9a69981568afde16ab5"} Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.868779 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b75f8a1f249b4474bfd006cc299fc78dc521b3e1c0e9a69981568afde16ab5" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.868793 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.881470 4842 scope.go:117] "RemoveContainer" containerID="2fefbb3155611215e84b42921aafe014f5be4fc9082db1892dce00f2df709057" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.918199 4842 scope.go:117] "RemoveContainer" containerID="522e7ef10a115849a60604a0be41b28fe0201d5ce30f2cab5a5eb39f8c2b7965" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.920373 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsn92"] Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.928434 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsn92"] Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.940723 4842 scope.go:117] "RemoveContainer" containerID="ea6be4b30805d675f8551722b44e4a21003a4767d3371b061b6cbdedc65f0492" Mar 11 19:02:57 crc kubenswrapper[4842]: I0311 19:02:57.993791 4842 scope.go:117] "RemoveContainer" containerID="665c78013595e4ca345ba66a4d4e90719a48e1cd6c754339b5cf7c0b130dae90" Mar 11 19:02:58 crc kubenswrapper[4842]: I0311 19:02:58.010716 4842 scope.go:117] "RemoveContainer" containerID="f7cf7cbd287837d134175a8a67b35164cc3d092146afcdd50b8f9519f0f3b713" Mar 11 19:02:58 crc kubenswrapper[4842]: I0311 19:02:58.022900 4842 scope.go:117] "RemoveContainer" containerID="bfc9bf8b715f660e6853071b75280cbb95725a5dcb5de0c0fdca08753fd6196e" Mar 11 19:02:58 crc kubenswrapper[4842]: I0311 19:02:58.034215 4842 scope.go:117] "RemoveContainer" containerID="5b2a54bbbebc276b4ec87a4b84b66ddcc73690c97df24276ee240cb6703acf8b" Mar 11 19:02:58 crc kubenswrapper[4842]: I0311 19:02:58.047185 4842 scope.go:117] "RemoveContainer" containerID="2a759b0317cb83de857557aefec08da46907a0cafa319dc65e3ef171c3019bd5" Mar 11 19:02:58 crc kubenswrapper[4842]: I0311 19:02:58.878413 4842 generic.go:334] "Generic (PLEG): container 
finished" podID="0df03b05-5e61-410e-b300-2badef3f6a03" containerID="6608e2941898e8c8513f9b9bec7629797f102e012df0d0b9603757de3d9df576" exitCode=0 Mar 11 19:02:58 crc kubenswrapper[4842]: I0311 19:02:58.878455 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerDied","Data":"6608e2941898e8c8513f9b9bec7629797f102e012df0d0b9603757de3d9df576"} Mar 11 19:02:58 crc kubenswrapper[4842]: I0311 19:02:58.971104 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c32da15-9b98-45c1-be42-d7d0e89428c5" path="/var/lib/kubelet/pods/5c32da15-9b98-45c1-be42-d7d0e89428c5/volumes" Mar 11 19:02:59 crc kubenswrapper[4842]: I0311 19:02:59.890183 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"56d2622a35def7c620d6bcf1aeb09cdcb3d7f1263e32a713eb55b05bd0ff19c8"} Mar 11 19:02:59 crc kubenswrapper[4842]: I0311 19:02:59.890647 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"17e511f1414ed268d6d6a3d20685a1c36d9bb0ebd2befb560329d0e209b40e76"} Mar 11 19:02:59 crc kubenswrapper[4842]: I0311 19:02:59.890672 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"67ba424497fd4e7c8f5896ec8254e912cb353969174b1a09e34133e1b960a9c9"} Mar 11 19:02:59 crc kubenswrapper[4842]: I0311 19:02:59.890691 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"e349806345b479819ef9af1d8bf068be3e03b4357d95d78c1a51f34f916da71d"} Mar 11 
19:02:59 crc kubenswrapper[4842]: I0311 19:02:59.890709 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"4c2f5c1f45332a7da526ce00e00d242a4ec36d09f389d79e78259abefb69aad7"} Mar 11 19:02:59 crc kubenswrapper[4842]: I0311 19:02:59.890726 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"05c68cb6b5a9f33c34fe43dc83cb087614a8aa784fd3cfa66dc9fd3605ce2699"} Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.471890 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.471982 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.472060 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.473007 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd17d155f0763fe0e3f142ca18755ab7a2e8fd0c5e83a7bdd2e0037d15a4c528"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.473104 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://dd17d155f0763fe0e3f142ca18755ab7a2e8fd0c5e83a7bdd2e0037d15a4c528" gracePeriod=600 Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.910692 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="dd17d155f0763fe0e3f142ca18755ab7a2e8fd0c5e83a7bdd2e0037d15a4c528" exitCode=0 Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.910777 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"dd17d155f0763fe0e3f142ca18755ab7a2e8fd0c5e83a7bdd2e0037d15a4c528"} Mar 11 19:03:01 crc kubenswrapper[4842]: I0311 19:03:01.911037 4842 scope.go:117] "RemoveContainer" containerID="fd8f5bc1cb45867bb48f671ce0a50d33b14c76234538c7c5a98f898ebef3f305" Mar 11 19:03:02 crc kubenswrapper[4842]: I0311 19:03:02.919133 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"741cee34022ccdac48b6c603ba201ced3a7f7803c4c8a38143440982a01cfafb"} Mar 11 19:03:02 crc kubenswrapper[4842]: I0311 19:03:02.922944 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"a131c68f1b61d8a227091c60f953e816a9fb1af99eb569c707ce579e07d9174b"} Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.075042 4842 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4"] Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.075820 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.077550 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.077779 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.078499 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-d8w8s" Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.144812 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltdv\" (UniqueName: \"kubernetes.io/projected/db180dd9-3a80-4f69-a4f9-ffbf52edfe72-kube-api-access-qltdv\") pod \"nmstate-operator-796d4cfff4-zm8f4\" (UID: \"db180dd9-3a80-4f69-a4f9-ffbf52edfe72\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.245593 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qltdv\" (UniqueName: \"kubernetes.io/projected/db180dd9-3a80-4f69-a4f9-ffbf52edfe72-kube-api-access-qltdv\") pod \"nmstate-operator-796d4cfff4-zm8f4\" (UID: \"db180dd9-3a80-4f69-a4f9-ffbf52edfe72\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.269990 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltdv\" (UniqueName: \"kubernetes.io/projected/db180dd9-3a80-4f69-a4f9-ffbf52edfe72-kube-api-access-qltdv\") pod \"nmstate-operator-796d4cfff4-zm8f4\" (UID: 
\"db180dd9-3a80-4f69-a4f9-ffbf52edfe72\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:03 crc kubenswrapper[4842]: I0311 19:03:03.389319 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:03 crc kubenswrapper[4842]: E0311 19:03:03.422954 4842 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(2af209248dc8704145edceb77355e3c8428c8930f25f92e7bfadef0ffa05b605): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 19:03:03 crc kubenswrapper[4842]: E0311 19:03:03.423031 4842 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(2af209248dc8704145edceb77355e3c8428c8930f25f92e7bfadef0ffa05b605): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:03 crc kubenswrapper[4842]: E0311 19:03:03.423059 4842 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(2af209248dc8704145edceb77355e3c8428c8930f25f92e7bfadef0ffa05b605): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:03 crc kubenswrapper[4842]: E0311 19:03:03.423106 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate(db180dd9-3a80-4f69-a4f9-ffbf52edfe72)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate(db180dd9-3a80-4f69-a4f9-ffbf52edfe72)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(2af209248dc8704145edceb77355e3c8428c8930f25f92e7bfadef0ffa05b605): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" podUID="db180dd9-3a80-4f69-a4f9-ffbf52edfe72" Mar 11 19:03:04 crc kubenswrapper[4842]: I0311 19:03:04.937348 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" event={"ID":"0df03b05-5e61-410e-b300-2badef3f6a03","Type":"ContainerStarted","Data":"14118c091a1c7a17542946306adebb23510e70d5d515697d4cf157846e9b6ced"} Mar 11 19:03:04 crc kubenswrapper[4842]: I0311 19:03:04.937779 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:03:04 crc kubenswrapper[4842]: I0311 19:03:04.937792 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:03:04 crc kubenswrapper[4842]: I0311 19:03:04.937801 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:03:04 crc kubenswrapper[4842]: I0311 19:03:04.963562 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" podStartSLOduration=7.963547556 
podStartE2EDuration="7.963547556s" podCreationTimestamp="2026-03-11 19:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:03:04.961322978 +0000 UTC m=+830.609019268" watchObservedRunningTime="2026-03-11 19:03:04.963547556 +0000 UTC m=+830.611243826" Mar 11 19:03:05 crc kubenswrapper[4842]: I0311 19:03:05.012422 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:03:05 crc kubenswrapper[4842]: I0311 19:03:05.012673 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:03:05 crc kubenswrapper[4842]: I0311 19:03:05.596982 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4"] Mar 11 19:03:05 crc kubenswrapper[4842]: I0311 19:03:05.597416 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:05 crc kubenswrapper[4842]: I0311 19:03:05.597844 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:05 crc kubenswrapper[4842]: E0311 19:03:05.619865 4842 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(3a068cca0e01954cc3795fa680a44e90f90623e0b96a4c20c4ab7ac55575ae28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 19:03:05 crc kubenswrapper[4842]: E0311 19:03:05.619932 4842 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(3a068cca0e01954cc3795fa680a44e90f90623e0b96a4c20c4ab7ac55575ae28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:05 crc kubenswrapper[4842]: E0311 19:03:05.619956 4842 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(3a068cca0e01954cc3795fa680a44e90f90623e0b96a4c20c4ab7ac55575ae28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:05 crc kubenswrapper[4842]: E0311 19:03:05.620004 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate(db180dd9-3a80-4f69-a4f9-ffbf52edfe72)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate(db180dd9-3a80-4f69-a4f9-ffbf52edfe72)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_nmstate-operator-796d4cfff4-zm8f4_openshift-nmstate_db180dd9-3a80-4f69-a4f9-ffbf52edfe72_0(3a068cca0e01954cc3795fa680a44e90f90623e0b96a4c20c4ab7ac55575ae28): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" podUID="db180dd9-3a80-4f69-a4f9-ffbf52edfe72" Mar 11 19:03:12 crc kubenswrapper[4842]: I0311 19:03:12.448425 4842 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 19:03:18 crc kubenswrapper[4842]: I0311 19:03:18.961867 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:18 crc kubenswrapper[4842]: I0311 19:03:18.963130 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" Mar 11 19:03:19 crc kubenswrapper[4842]: I0311 19:03:19.163879 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4"] Mar 11 19:03:19 crc kubenswrapper[4842]: W0311 19:03:19.167756 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb180dd9_3a80_4f69_a4f9_ffbf52edfe72.slice/crio-e39256abe62623f7474958f18e10491216a20e82c0b6b16a5a04c2a36fe33802 WatchSource:0}: Error finding container e39256abe62623f7474958f18e10491216a20e82c0b6b16a5a04c2a36fe33802: Status 404 returned error can't find the container with id e39256abe62623f7474958f18e10491216a20e82c0b6b16a5a04c2a36fe33802 Mar 11 19:03:20 crc kubenswrapper[4842]: I0311 19:03:20.033507 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" event={"ID":"db180dd9-3a80-4f69-a4f9-ffbf52edfe72","Type":"ContainerStarted","Data":"e39256abe62623f7474958f18e10491216a20e82c0b6b16a5a04c2a36fe33802"} Mar 11 19:03:22 crc kubenswrapper[4842]: I0311 19:03:22.049762 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" 
event={"ID":"db180dd9-3a80-4f69-a4f9-ffbf52edfe72","Type":"ContainerStarted","Data":"2899fc4452e2d0398d9e07bfc000554fb47348efa3eb08fe915bd39c2afbeb8b"} Mar 11 19:03:22 crc kubenswrapper[4842]: I0311 19:03:22.077976 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zm8f4" podStartSLOduration=16.733321623 podStartE2EDuration="19.07794409s" podCreationTimestamp="2026-03-11 19:03:03 +0000 UTC" firstStartedPulling="2026-03-11 19:03:19.169620083 +0000 UTC m=+844.817316403" lastFinishedPulling="2026-03-11 19:03:21.51424259 +0000 UTC m=+847.161938870" observedRunningTime="2026-03-11 19:03:22.077175669 +0000 UTC m=+847.724871979" watchObservedRunningTime="2026-03-11 19:03:22.07794409 +0000 UTC m=+847.725640400" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.196395 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.197803 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.200649 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wc886" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.208201 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.208986 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.228235 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.244561 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.250086 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.303344 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-nhcr2"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.304219 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.318823 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6nv\" (UniqueName: \"kubernetes.io/projected/ff0ba288-4474-4f9f-bf10-e4955b9142a0-kube-api-access-9d6nv\") pod \"nmstate-metrics-9b8c8685d-2tbdg\" (UID: \"ff0ba288-4474-4f9f-bf10-e4955b9142a0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.319043 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/405456d2-dac9-4c6b-93fb-ef142b02cd7e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rvwdc\" (UID: \"405456d2-dac9-4c6b-93fb-ef142b02cd7e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.319144 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llgnv\" 
(UniqueName: \"kubernetes.io/projected/405456d2-dac9-4c6b-93fb-ef142b02cd7e-kube-api-access-llgnv\") pod \"nmstate-webhook-5f558f5558-rvwdc\" (UID: \"405456d2-dac9-4c6b-93fb-ef142b02cd7e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.390141 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.390925 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.392967 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-nrkrr" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.394059 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.394332 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.409710 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.420244 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-nmstate-lock\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.420317 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llgnv\" (UniqueName: \"kubernetes.io/projected/405456d2-dac9-4c6b-93fb-ef142b02cd7e-kube-api-access-llgnv\") pod 
\"nmstate-webhook-5f558f5558-rvwdc\" (UID: \"405456d2-dac9-4c6b-93fb-ef142b02cd7e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.420339 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-ovs-socket\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.420372 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z8l\" (UniqueName: \"kubernetes.io/projected/caae1282-da5e-4162-960f-500306facaf1-kube-api-access-q6z8l\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.420407 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-dbus-socket\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.420424 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6nv\" (UniqueName: \"kubernetes.io/projected/ff0ba288-4474-4f9f-bf10-e4955b9142a0-kube-api-access-9d6nv\") pod \"nmstate-metrics-9b8c8685d-2tbdg\" (UID: \"ff0ba288-4474-4f9f-bf10-e4955b9142a0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.420440 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/405456d2-dac9-4c6b-93fb-ef142b02cd7e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rvwdc\" (UID: \"405456d2-dac9-4c6b-93fb-ef142b02cd7e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.430998 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/405456d2-dac9-4c6b-93fb-ef142b02cd7e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rvwdc\" (UID: \"405456d2-dac9-4c6b-93fb-ef142b02cd7e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.437430 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llgnv\" (UniqueName: \"kubernetes.io/projected/405456d2-dac9-4c6b-93fb-ef142b02cd7e-kube-api-access-llgnv\") pod \"nmstate-webhook-5f558f5558-rvwdc\" (UID: \"405456d2-dac9-4c6b-93fb-ef142b02cd7e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.442769 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d6nv\" (UniqueName: \"kubernetes.io/projected/ff0ba288-4474-4f9f-bf10-e4955b9142a0-kube-api-access-9d6nv\") pod \"nmstate-metrics-9b8c8685d-2tbdg\" (UID: \"ff0ba288-4474-4f9f-bf10-e4955b9142a0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.515864 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522030 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-ovs-socket\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522095 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4af38b63-4aad-4175-a610-44575dda0d08-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522116 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-ovs-socket\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522156 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6z8l\" (UniqueName: \"kubernetes.io/projected/caae1282-da5e-4162-960f-500306facaf1-kube-api-access-q6z8l\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522260 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5hqd\" (UniqueName: \"kubernetes.io/projected/4af38b63-4aad-4175-a610-44575dda0d08-kube-api-access-k5hqd\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" 
(UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522313 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4af38b63-4aad-4175-a610-44575dda0d08-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522341 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-dbus-socket\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522392 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-nmstate-lock\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522489 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-nmstate-lock\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.522790 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/caae1282-da5e-4162-960f-500306facaf1-dbus-socket\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " 
pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.527990 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.546746 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6z8l\" (UniqueName: \"kubernetes.io/projected/caae1282-da5e-4162-960f-500306facaf1-kube-api-access-q6z8l\") pod \"nmstate-handler-nhcr2\" (UID: \"caae1282-da5e-4162-960f-500306facaf1\") " pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.583019 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-77b59f9678-2s2xc"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.583672 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627122 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-console-config\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627409 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm42l\" (UniqueName: \"kubernetes.io/projected/2d5ca049-bb3f-47e8-9945-363c19752e00-kube-api-access-wm42l\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627438 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4af38b63-4aad-4175-a610-44575dda0d08-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627471 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d5ca049-bb3f-47e8-9945-363c19752e00-console-oauth-config\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627501 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5hqd\" (UniqueName: \"kubernetes.io/projected/4af38b63-4aad-4175-a610-44575dda0d08-kube-api-access-k5hqd\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627519 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-service-ca\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627534 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d5ca049-bb3f-47e8-9945-363c19752e00-console-serving-cert\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627551 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4af38b63-4aad-4175-a610-44575dda0d08-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627577 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-oauth-serving-cert\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.627596 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-trusted-ca-bundle\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: E0311 19:03:23.627708 4842 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 11 19:03:23 crc kubenswrapper[4842]: E0311 19:03:23.627750 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af38b63-4aad-4175-a610-44575dda0d08-plugin-serving-cert podName:4af38b63-4aad-4175-a610-44575dda0d08 nodeName:}" failed. No retries permitted until 2026-03-11 19:03:24.127735084 +0000 UTC m=+849.775431364 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/4af38b63-4aad-4175-a610-44575dda0d08-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-8vdjz" (UID: "4af38b63-4aad-4175-a610-44575dda0d08") : secret "plugin-serving-cert" not found Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.631523 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b59f9678-2s2xc"] Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.632307 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4af38b63-4aad-4175-a610-44575dda0d08-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.650860 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.656059 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5hqd\" (UniqueName: \"kubernetes.io/projected/4af38b63-4aad-4175-a610-44575dda0d08-kube-api-access-k5hqd\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.729491 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm42l\" (UniqueName: \"kubernetes.io/projected/2d5ca049-bb3f-47e8-9945-363c19752e00-kube-api-access-wm42l\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.729585 4842 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d5ca049-bb3f-47e8-9945-363c19752e00-console-oauth-config\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.729621 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-service-ca\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.729638 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d5ca049-bb3f-47e8-9945-363c19752e00-console-serving-cert\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.729670 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-oauth-serving-cert\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.729691 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-trusted-ca-bundle\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.729708 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-console-config\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.730566 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-console-config\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.731844 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-service-ca\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.731968 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-oauth-serving-cert\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.732983 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d5ca049-bb3f-47e8-9945-363c19752e00-trusted-ca-bundle\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.733163 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2d5ca049-bb3f-47e8-9945-363c19752e00-console-serving-cert\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.736903 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d5ca049-bb3f-47e8-9945-363c19752e00-console-oauth-config\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.744654 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm42l\" (UniqueName: \"kubernetes.io/projected/2d5ca049-bb3f-47e8-9945-363c19752e00-kube-api-access-wm42l\") pod \"console-77b59f9678-2s2xc\" (UID: \"2d5ca049-bb3f-47e8-9945-363c19752e00\") " pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.909131 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:23 crc kubenswrapper[4842]: I0311 19:03:23.972971 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg"] Mar 11 19:03:23 crc kubenswrapper[4842]: W0311 19:03:23.983834 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0ba288_4474_4f9f_bf10_e4955b9142a0.slice/crio-75c105862bbaf6aca1a10bcecb0eea7ac00de9b73ea361be04788674ebf8a757 WatchSource:0}: Error finding container 75c105862bbaf6aca1a10bcecb0eea7ac00de9b73ea361be04788674ebf8a757: Status 404 returned error can't find the container with id 75c105862bbaf6aca1a10bcecb0eea7ac00de9b73ea361be04788674ebf8a757 Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.035584 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc"] Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.058488 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" event={"ID":"ff0ba288-4474-4f9f-bf10-e4955b9142a0","Type":"ContainerStarted","Data":"75c105862bbaf6aca1a10bcecb0eea7ac00de9b73ea361be04788674ebf8a757"} Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.059090 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nhcr2" event={"ID":"caae1282-da5e-4162-960f-500306facaf1","Type":"ContainerStarted","Data":"d752234dc7a964f8773f48184ed29e0cac9148d7ed713b48a3a60f65b2cf7f2b"} Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.060630 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" event={"ID":"405456d2-dac9-4c6b-93fb-ef142b02cd7e","Type":"ContainerStarted","Data":"7b267fa15d18ada06757184578f14de6fef98d2ed30661fe1e64dbf2f55a1227"} Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.125887 
4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b59f9678-2s2xc"] Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.134712 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4af38b63-4aad-4175-a610-44575dda0d08-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:24 crc kubenswrapper[4842]: W0311 19:03:24.135874 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d5ca049_bb3f_47e8_9945_363c19752e00.slice/crio-6eedd5e0b7a46709b8f4b58777d9f99779940190f77bcfcbcf53870703cfc418 WatchSource:0}: Error finding container 6eedd5e0b7a46709b8f4b58777d9f99779940190f77bcfcbcf53870703cfc418: Status 404 returned error can't find the container with id 6eedd5e0b7a46709b8f4b58777d9f99779940190f77bcfcbcf53870703cfc418 Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.139419 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4af38b63-4aad-4175-a610-44575dda0d08-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8vdjz\" (UID: \"4af38b63-4aad-4175-a610-44575dda0d08\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.309305 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" Mar 11 19:03:24 crc kubenswrapper[4842]: I0311 19:03:24.529677 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz"] Mar 11 19:03:24 crc kubenswrapper[4842]: W0311 19:03:24.535499 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af38b63_4aad_4175_a610_44575dda0d08.slice/crio-af1294dadbf5a9a66dafad182aff84ad9c79d20222757cb1e20b2dc206018a49 WatchSource:0}: Error finding container af1294dadbf5a9a66dafad182aff84ad9c79d20222757cb1e20b2dc206018a49: Status 404 returned error can't find the container with id af1294dadbf5a9a66dafad182aff84ad9c79d20222757cb1e20b2dc206018a49 Mar 11 19:03:25 crc kubenswrapper[4842]: I0311 19:03:25.067494 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b59f9678-2s2xc" event={"ID":"2d5ca049-bb3f-47e8-9945-363c19752e00","Type":"ContainerStarted","Data":"3baf61a1ef2a0d67dbc9fd2009ccdc602bf61a16fcdfd8c2e82902c51445c0e1"} Mar 11 19:03:25 crc kubenswrapper[4842]: I0311 19:03:25.067917 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b59f9678-2s2xc" event={"ID":"2d5ca049-bb3f-47e8-9945-363c19752e00","Type":"ContainerStarted","Data":"6eedd5e0b7a46709b8f4b58777d9f99779940190f77bcfcbcf53870703cfc418"} Mar 11 19:03:25 crc kubenswrapper[4842]: I0311 19:03:25.069615 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" event={"ID":"4af38b63-4aad-4175-a610-44575dda0d08","Type":"ContainerStarted","Data":"af1294dadbf5a9a66dafad182aff84ad9c79d20222757cb1e20b2dc206018a49"} Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.079494 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" 
event={"ID":"405456d2-dac9-4c6b-93fb-ef142b02cd7e","Type":"ContainerStarted","Data":"23b06acc0bb2f85446d03860e3242faf4d5d85c48e01304b42e18ddaed022e15"} Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.079832 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.080358 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" event={"ID":"ff0ba288-4474-4f9f-bf10-e4955b9142a0","Type":"ContainerStarted","Data":"6bc8e513f4c1668d3478b2d989256941203fa88cd068e73f16bb7036d1f236ff"} Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.081302 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nhcr2" event={"ID":"caae1282-da5e-4162-960f-500306facaf1","Type":"ContainerStarted","Data":"9339b94e5c29cbb82840c6cfb70615d6f72b9e341a03029a86207023ee691982"} Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.081805 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.101729 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77b59f9678-2s2xc" podStartSLOduration=4.101710728 podStartE2EDuration="4.101710728s" podCreationTimestamp="2026-03-11 19:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:03:25.085970421 +0000 UTC m=+850.733666721" watchObservedRunningTime="2026-03-11 19:03:27.101710728 +0000 UTC m=+852.749407008" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.103217 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" podStartSLOduration=1.529315413 podStartE2EDuration="4.103207879s" 
podCreationTimestamp="2026-03-11 19:03:23 +0000 UTC" firstStartedPulling="2026-03-11 19:03:24.04798728 +0000 UTC m=+849.695683570" lastFinishedPulling="2026-03-11 19:03:26.621879756 +0000 UTC m=+852.269576036" observedRunningTime="2026-03-11 19:03:27.095161619 +0000 UTC m=+852.742857899" watchObservedRunningTime="2026-03-11 19:03:27.103207879 +0000 UTC m=+852.750904159" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.113446 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-nhcr2" podStartSLOduration=1.136182536 podStartE2EDuration="4.113427777s" podCreationTimestamp="2026-03-11 19:03:23 +0000 UTC" firstStartedPulling="2026-03-11 19:03:23.689348555 +0000 UTC m=+849.337044835" lastFinishedPulling="2026-03-11 19:03:26.666593766 +0000 UTC m=+852.314290076" observedRunningTime="2026-03-11 19:03:27.108199535 +0000 UTC m=+852.755895815" watchObservedRunningTime="2026-03-11 19:03:27.113427777 +0000 UTC m=+852.761124057" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.717089 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p88hb" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.941199 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fp8p4"] Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.950480 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.965806 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp8p4"] Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.980632 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-catalog-content\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.980684 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d78lz\" (UniqueName: \"kubernetes.io/projected/44d7713b-d257-4c1e-bf42-52d4134af893-kube-api-access-d78lz\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:27 crc kubenswrapper[4842]: I0311 19:03:27.981171 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-utilities\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.082020 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-catalog-content\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.082566 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d78lz\" (UniqueName: \"kubernetes.io/projected/44d7713b-d257-4c1e-bf42-52d4134af893-kube-api-access-d78lz\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.082520 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-catalog-content\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.082666 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-utilities\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.083006 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-utilities\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.088204 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" event={"ID":"4af38b63-4aad-4175-a610-44575dda0d08","Type":"ContainerStarted","Data":"61f503f6e420d489835be63ef8c37407aff499a8291189791ddca5598d792996"} Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.107687 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8vdjz" 
podStartSLOduration=2.132870131 podStartE2EDuration="5.107670895s" podCreationTimestamp="2026-03-11 19:03:23 +0000 UTC" firstStartedPulling="2026-03-11 19:03:24.537336342 +0000 UTC m=+850.185032622" lastFinishedPulling="2026-03-11 19:03:27.512137106 +0000 UTC m=+853.159833386" observedRunningTime="2026-03-11 19:03:28.104902619 +0000 UTC m=+853.752598899" watchObservedRunningTime="2026-03-11 19:03:28.107670895 +0000 UTC m=+853.755367175" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.112036 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d78lz\" (UniqueName: \"kubernetes.io/projected/44d7713b-d257-4c1e-bf42-52d4134af893-kube-api-access-d78lz\") pod \"redhat-marketplace-fp8p4\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.274367 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:28 crc kubenswrapper[4842]: I0311 19:03:28.509312 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp8p4"] Mar 11 19:03:28 crc kubenswrapper[4842]: W0311 19:03:28.516823 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d7713b_d257_4c1e_bf42_52d4134af893.slice/crio-f9e1bc4c812cf6b97e167c05c3de4aa2c7246e77affdc8f24fc999e7318865b7 WatchSource:0}: Error finding container f9e1bc4c812cf6b97e167c05c3de4aa2c7246e77affdc8f24fc999e7318865b7: Status 404 returned error can't find the container with id f9e1bc4c812cf6b97e167c05c3de4aa2c7246e77affdc8f24fc999e7318865b7 Mar 11 19:03:29 crc kubenswrapper[4842]: I0311 19:03:29.094088 4842 generic.go:334] "Generic (PLEG): container finished" podID="44d7713b-d257-4c1e-bf42-52d4134af893" containerID="725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91" exitCode=0 
Mar 11 19:03:29 crc kubenswrapper[4842]: I0311 19:03:29.094214 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp8p4" event={"ID":"44d7713b-d257-4c1e-bf42-52d4134af893","Type":"ContainerDied","Data":"725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91"} Mar 11 19:03:29 crc kubenswrapper[4842]: I0311 19:03:29.094266 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp8p4" event={"ID":"44d7713b-d257-4c1e-bf42-52d4134af893","Type":"ContainerStarted","Data":"f9e1bc4c812cf6b97e167c05c3de4aa2c7246e77affdc8f24fc999e7318865b7"} Mar 11 19:03:30 crc kubenswrapper[4842]: I0311 19:03:30.102630 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" event={"ID":"ff0ba288-4474-4f9f-bf10-e4955b9142a0","Type":"ContainerStarted","Data":"3d5e01d4a6287151ad3815abd22de0fd616905734f35d25c306da87ba19a243f"} Mar 11 19:03:30 crc kubenswrapper[4842]: I0311 19:03:30.125649 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-2tbdg" podStartSLOduration=1.677075184 podStartE2EDuration="7.125623031s" podCreationTimestamp="2026-03-11 19:03:23 +0000 UTC" firstStartedPulling="2026-03-11 19:03:23.9860123 +0000 UTC m=+849.633708580" lastFinishedPulling="2026-03-11 19:03:29.434560147 +0000 UTC m=+855.082256427" observedRunningTime="2026-03-11 19:03:30.118521488 +0000 UTC m=+855.766217778" watchObservedRunningTime="2026-03-11 19:03:30.125623031 +0000 UTC m=+855.773319311" Mar 11 19:03:31 crc kubenswrapper[4842]: I0311 19:03:31.112643 4842 generic.go:334] "Generic (PLEG): container finished" podID="44d7713b-d257-4c1e-bf42-52d4134af893" containerID="ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43" exitCode=0 Mar 11 19:03:31 crc kubenswrapper[4842]: I0311 19:03:31.112746 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fp8p4" event={"ID":"44d7713b-d257-4c1e-bf42-52d4134af893","Type":"ContainerDied","Data":"ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43"} Mar 11 19:03:32 crc kubenswrapper[4842]: I0311 19:03:32.121702 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp8p4" event={"ID":"44d7713b-d257-4c1e-bf42-52d4134af893","Type":"ContainerStarted","Data":"9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802"} Mar 11 19:03:32 crc kubenswrapper[4842]: I0311 19:03:32.168258 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fp8p4" podStartSLOduration=2.990546807 podStartE2EDuration="5.168231621s" podCreationTimestamp="2026-03-11 19:03:27 +0000 UTC" firstStartedPulling="2026-03-11 19:03:29.384152652 +0000 UTC m=+855.031848932" lastFinishedPulling="2026-03-11 19:03:31.561837466 +0000 UTC m=+857.209533746" observedRunningTime="2026-03-11 19:03:32.145364938 +0000 UTC m=+857.793061238" watchObservedRunningTime="2026-03-11 19:03:32.168231621 +0000 UTC m=+857.815927921" Mar 11 19:03:33 crc kubenswrapper[4842]: I0311 19:03:33.682645 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-nhcr2" Mar 11 19:03:33 crc kubenswrapper[4842]: I0311 19:03:33.910220 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:33 crc kubenswrapper[4842]: I0311 19:03:33.910286 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:33 crc kubenswrapper[4842]: I0311 19:03:33.915222 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:34 crc kubenswrapper[4842]: I0311 19:03:34.138869 4842 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-77b59f9678-2s2xc" Mar 11 19:03:34 crc kubenswrapper[4842]: I0311 19:03:34.219438 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v5z72"] Mar 11 19:03:38 crc kubenswrapper[4842]: I0311 19:03:38.275978 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:38 crc kubenswrapper[4842]: I0311 19:03:38.276369 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:38 crc kubenswrapper[4842]: I0311 19:03:38.320823 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:39 crc kubenswrapper[4842]: I0311 19:03:39.232912 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:39 crc kubenswrapper[4842]: I0311 19:03:39.300907 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp8p4"] Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.187936 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fp8p4" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="registry-server" containerID="cri-o://9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802" gracePeriod=2 Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.670351 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.714549 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-catalog-content\") pod \"44d7713b-d257-4c1e-bf42-52d4134af893\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.714650 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d78lz\" (UniqueName: \"kubernetes.io/projected/44d7713b-d257-4c1e-bf42-52d4134af893-kube-api-access-d78lz\") pod \"44d7713b-d257-4c1e-bf42-52d4134af893\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.714735 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-utilities\") pod \"44d7713b-d257-4c1e-bf42-52d4134af893\" (UID: \"44d7713b-d257-4c1e-bf42-52d4134af893\") " Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.716613 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-utilities" (OuterVolumeSpecName: "utilities") pod "44d7713b-d257-4c1e-bf42-52d4134af893" (UID: "44d7713b-d257-4c1e-bf42-52d4134af893"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.727802 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d7713b-d257-4c1e-bf42-52d4134af893-kube-api-access-d78lz" (OuterVolumeSpecName: "kube-api-access-d78lz") pod "44d7713b-d257-4c1e-bf42-52d4134af893" (UID: "44d7713b-d257-4c1e-bf42-52d4134af893"). InnerVolumeSpecName "kube-api-access-d78lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.751415 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44d7713b-d257-4c1e-bf42-52d4134af893" (UID: "44d7713b-d257-4c1e-bf42-52d4134af893"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.817219 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.817315 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d78lz\" (UniqueName: \"kubernetes.io/projected/44d7713b-d257-4c1e-bf42-52d4134af893-kube-api-access-d78lz\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:41 crc kubenswrapper[4842]: I0311 19:03:41.817342 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d7713b-d257-4c1e-bf42-52d4134af893-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.198157 4842 generic.go:334] "Generic (PLEG): container finished" podID="44d7713b-d257-4c1e-bf42-52d4134af893" containerID="9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802" exitCode=0 Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.198243 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp8p4" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.198241 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp8p4" event={"ID":"44d7713b-d257-4c1e-bf42-52d4134af893","Type":"ContainerDied","Data":"9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802"} Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.198437 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp8p4" event={"ID":"44d7713b-d257-4c1e-bf42-52d4134af893","Type":"ContainerDied","Data":"f9e1bc4c812cf6b97e167c05c3de4aa2c7246e77affdc8f24fc999e7318865b7"} Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.198476 4842 scope.go:117] "RemoveContainer" containerID="9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.227723 4842 scope.go:117] "RemoveContainer" containerID="ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.255857 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp8p4"] Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.267844 4842 scope.go:117] "RemoveContainer" containerID="725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.271403 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp8p4"] Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.287720 4842 scope.go:117] "RemoveContainer" containerID="9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802" Mar 11 19:03:42 crc kubenswrapper[4842]: E0311 19:03:42.288316 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802\": container with ID starting with 9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802 not found: ID does not exist" containerID="9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.288358 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802"} err="failed to get container status \"9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802\": rpc error: code = NotFound desc = could not find container \"9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802\": container with ID starting with 9b31613d5e4fcf0118101a24a9a1d1f0c5a0ad3a80df7a12048501b4b3647802 not found: ID does not exist" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.288388 4842 scope.go:117] "RemoveContainer" containerID="ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43" Mar 11 19:03:42 crc kubenswrapper[4842]: E0311 19:03:42.288645 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43\": container with ID starting with ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43 not found: ID does not exist" containerID="ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.288687 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43"} err="failed to get container status \"ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43\": rpc error: code = NotFound desc = could not find container \"ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43\": container with ID 
starting with ab5c500ee5a4f1d72707d67486b01f11c334edc1b1a480db6b397d34042fdb43 not found: ID does not exist" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.288709 4842 scope.go:117] "RemoveContainer" containerID="725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91" Mar 11 19:03:42 crc kubenswrapper[4842]: E0311 19:03:42.288960 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91\": container with ID starting with 725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91 not found: ID does not exist" containerID="725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.289003 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91"} err="failed to get container status \"725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91\": rpc error: code = NotFound desc = could not find container \"725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91\": container with ID starting with 725b9fc5b6795d4407bbaf57bf16a5172e253b336c85df78cfc1c8cef22aeb91 not found: ID does not exist" Mar 11 19:03:42 crc kubenswrapper[4842]: I0311 19:03:42.973805 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" path="/var/lib/kubelet/pods/44d7713b-d257-4c1e-bf42-52d4134af893/volumes" Mar 11 19:03:43 crc kubenswrapper[4842]: I0311 19:03:43.537841 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rvwdc" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.309869 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk"] 
Mar 11 19:03:56 crc kubenswrapper[4842]: E0311 19:03:56.310929 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="extract-utilities" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.310981 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="extract-utilities" Mar 11 19:03:56 crc kubenswrapper[4842]: E0311 19:03:56.311000 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="registry-server" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.311014 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="registry-server" Mar 11 19:03:56 crc kubenswrapper[4842]: E0311 19:03:56.311039 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="extract-content" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.311053 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="extract-content" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.311250 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d7713b-d257-4c1e-bf42-52d4134af893" containerName="registry-server" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.312710 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.319765 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.329073 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk"] Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.430611 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.430711 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.430789 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599bc\" (UniqueName: \"kubernetes.io/projected/255a795a-3aab-4e6f-a5b8-4baecc18d798-kube-api-access-599bc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: 
I0311 19:03:56.532237 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.532357 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.532432 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599bc\" (UniqueName: \"kubernetes.io/projected/255a795a-3aab-4e6f-a5b8-4baecc18d798-kube-api-access-599bc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.533615 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.533617 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.570759 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599bc\" (UniqueName: \"kubernetes.io/projected/255a795a-3aab-4e6f-a5b8-4baecc18d798-kube-api-access-599bc\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.629084 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:03:56 crc kubenswrapper[4842]: I0311 19:03:56.907942 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk"] Mar 11 19:03:57 crc kubenswrapper[4842]: I0311 19:03:57.293452 4842 generic.go:334] "Generic (PLEG): container finished" podID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerID="e8c6c2600988fab120a6c50ad0db4681a98f9ce95da23256c7c7f90628da990c" exitCode=0 Mar 11 19:03:57 crc kubenswrapper[4842]: I0311 19:03:57.293492 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" event={"ID":"255a795a-3aab-4e6f-a5b8-4baecc18d798","Type":"ContainerDied","Data":"e8c6c2600988fab120a6c50ad0db4681a98f9ce95da23256c7c7f90628da990c"} Mar 11 19:03:57 crc kubenswrapper[4842]: I0311 19:03:57.293516 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" event={"ID":"255a795a-3aab-4e6f-a5b8-4baecc18d798","Type":"ContainerStarted","Data":"43fce499d728b1af2d796de301a94b2307367fbcd7ab35528defc99a8292e2c9"} Mar 11 19:03:58 crc kubenswrapper[4842]: I0311 19:03:58.855640 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cw7nx"] Mar 11 19:03:58 crc kubenswrapper[4842]: I0311 19:03:58.856888 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:58 crc kubenswrapper[4842]: I0311 19:03:58.880101 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cw7nx"] Mar 11 19:03:58 crc kubenswrapper[4842]: I0311 19:03:58.967691 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjcf\" (UniqueName: \"kubernetes.io/projected/3d724675-eb06-43ad-af44-f08fe0461fc6-kube-api-access-9pjcf\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:58 crc kubenswrapper[4842]: I0311 19:03:58.968031 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-catalog-content\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:58 crc kubenswrapper[4842]: I0311 19:03:58.968142 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-utilities\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " 
pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.069403 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-catalog-content\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.069457 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-utilities\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.069510 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjcf\" (UniqueName: \"kubernetes.io/projected/3d724675-eb06-43ad-af44-f08fe0461fc6-kube-api-access-9pjcf\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.070062 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-catalog-content\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.070061 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-utilities\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc 
kubenswrapper[4842]: I0311 19:03:59.087944 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjcf\" (UniqueName: \"kubernetes.io/projected/3d724675-eb06-43ad-af44-f08fe0461fc6-kube-api-access-9pjcf\") pod \"redhat-operators-cw7nx\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.176153 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.262093 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-v5z72" podUID="b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" containerName="console" containerID="cri-o://f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a" gracePeriod=15 Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.364403 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cw7nx"] Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.606997 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v5z72_b3061213-bdcd-4ff1-b7bc-ac40a2c01e86/console/0.log" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.607396 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v5z72" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.674902 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-trusted-ca-bundle\") pod \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.674986 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-config\") pod \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675022 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-service-ca\") pod \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675063 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v55jb\" (UniqueName: \"kubernetes.io/projected/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-kube-api-access-v55jb\") pod \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675130 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-oauth-config\") pod \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675208 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-oauth-serving-cert\") pod \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675232 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-serving-cert\") pod \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\" (UID: \"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86\") " Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675652 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" (UID: "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675748 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" (UID: "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.675823 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-service-ca" (OuterVolumeSpecName: "service-ca") pod "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" (UID: "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.676728 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-config" (OuterVolumeSpecName: "console-config") pod "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" (UID: "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.681680 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-kube-api-access-v55jb" (OuterVolumeSpecName: "kube-api-access-v55jb") pod "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" (UID: "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86"). InnerVolumeSpecName "kube-api-access-v55jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.681877 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" (UID: "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.682492 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" (UID: "b3061213-bdcd-4ff1-b7bc-ac40a2c01e86"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.777053 4842 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.777096 4842 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.777107 4842 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.777119 4842 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.777131 4842 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.777143 4842 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 19:03:59 crc kubenswrapper[4842]: I0311 19:03:59.777153 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v55jb\" (UniqueName: \"kubernetes.io/projected/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86-kube-api-access-v55jb\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:00 crc 
kubenswrapper[4842]: I0311 19:04:00.148196 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554264-265lh"] Mar 11 19:04:00 crc kubenswrapper[4842]: E0311 19:04:00.148433 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" containerName="console" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.148445 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" containerName="console" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.148539 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" containerName="console" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.148897 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554264-265lh" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.152540 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.153295 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.153641 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.156690 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554264-265lh"] Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.182964 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq6xz\" (UniqueName: \"kubernetes.io/projected/6340e59d-3320-4e33-872c-5e809b37cf69-kube-api-access-cq6xz\") pod \"auto-csr-approver-29554264-265lh\" (UID: 
\"6340e59d-3320-4e33-872c-5e809b37cf69\") " pod="openshift-infra/auto-csr-approver-29554264-265lh" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.284932 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq6xz\" (UniqueName: \"kubernetes.io/projected/6340e59d-3320-4e33-872c-5e809b37cf69-kube-api-access-cq6xz\") pod \"auto-csr-approver-29554264-265lh\" (UID: \"6340e59d-3320-4e33-872c-5e809b37cf69\") " pod="openshift-infra/auto-csr-approver-29554264-265lh" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.302144 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq6xz\" (UniqueName: \"kubernetes.io/projected/6340e59d-3320-4e33-872c-5e809b37cf69-kube-api-access-cq6xz\") pod \"auto-csr-approver-29554264-265lh\" (UID: \"6340e59d-3320-4e33-872c-5e809b37cf69\") " pod="openshift-infra/auto-csr-approver-29554264-265lh" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.312910 4842 generic.go:334] "Generic (PLEG): container finished" podID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerID="7bd7d6a2672dc11be5757a8ca5b5d75a3de55a1ac5f8f9c91104598db1efbeb2" exitCode=0 Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.312996 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" event={"ID":"255a795a-3aab-4e6f-a5b8-4baecc18d798","Type":"ContainerDied","Data":"7bd7d6a2672dc11be5757a8ca5b5d75a3de55a1ac5f8f9c91104598db1efbeb2"} Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.315674 4842 generic.go:334] "Generic (PLEG): container finished" podID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerID="352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd" exitCode=0 Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.315745 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw7nx" 
event={"ID":"3d724675-eb06-43ad-af44-f08fe0461fc6","Type":"ContainerDied","Data":"352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd"} Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.315768 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw7nx" event={"ID":"3d724675-eb06-43ad-af44-f08fe0461fc6","Type":"ContainerStarted","Data":"b3d872b6322ecb5dcf38b84aadd649a8fc002ac0beaeb7939dfb2b41b928701c"} Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.317629 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v5z72_b3061213-bdcd-4ff1-b7bc-ac40a2c01e86/console/0.log" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.317706 4842 generic.go:334] "Generic (PLEG): container finished" podID="b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" containerID="f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a" exitCode=2 Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.317741 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5z72" event={"ID":"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86","Type":"ContainerDied","Data":"f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a"} Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.317750 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v5z72" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.317778 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v5z72" event={"ID":"b3061213-bdcd-4ff1-b7bc-ac40a2c01e86","Type":"ContainerDied","Data":"8e449753f2f70334d54ab85a1f093c5f791bc358184e45de0e50cc9543e40998"} Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.317803 4842 scope.go:117] "RemoveContainer" containerID="f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.347231 4842 scope.go:117] "RemoveContainer" containerID="f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a" Mar 11 19:04:00 crc kubenswrapper[4842]: E0311 19:04:00.347764 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a\": container with ID starting with f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a not found: ID does not exist" containerID="f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.347800 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a"} err="failed to get container status \"f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a\": rpc error: code = NotFound desc = could not find container \"f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a\": container with ID starting with f146ef0c9530ebf0763fa0bf05a85d3dffbcbc2c042896c89249285875f4bc1a not found: ID does not exist" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.383793 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v5z72"] Mar 11 19:04:00 crc 
kubenswrapper[4842]: I0311 19:04:00.388456 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-v5z72"] Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.463762 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554264-265lh" Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.913082 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554264-265lh"] Mar 11 19:04:00 crc kubenswrapper[4842]: W0311 19:04:00.922498 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6340e59d_3320_4e33_872c_5e809b37cf69.slice/crio-93aae693dd166bdbac441f1703009d67b060095834da324fcd9516145b17f95c WatchSource:0}: Error finding container 93aae693dd166bdbac441f1703009d67b060095834da324fcd9516145b17f95c: Status 404 returned error can't find the container with id 93aae693dd166bdbac441f1703009d67b060095834da324fcd9516145b17f95c Mar 11 19:04:00 crc kubenswrapper[4842]: I0311 19:04:00.970305 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3061213-bdcd-4ff1-b7bc-ac40a2c01e86" path="/var/lib/kubelet/pods/b3061213-bdcd-4ff1-b7bc-ac40a2c01e86/volumes" Mar 11 19:04:01 crc kubenswrapper[4842]: I0311 19:04:01.331533 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554264-265lh" event={"ID":"6340e59d-3320-4e33-872c-5e809b37cf69","Type":"ContainerStarted","Data":"93aae693dd166bdbac441f1703009d67b060095834da324fcd9516145b17f95c"} Mar 11 19:04:01 crc kubenswrapper[4842]: I0311 19:04:01.336962 4842 generic.go:334] "Generic (PLEG): container finished" podID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerID="67e4ec711339dd96d4d20d616586712c82baae42aad2f18b0e5cfc3d1f924e55" exitCode=0 Mar 11 19:04:01 crc kubenswrapper[4842]: I0311 19:04:01.336988 4842 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" event={"ID":"255a795a-3aab-4e6f-a5b8-4baecc18d798","Type":"ContainerDied","Data":"67e4ec711339dd96d4d20d616586712c82baae42aad2f18b0e5cfc3d1f924e55"} Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.344113 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw7nx" event={"ID":"3d724675-eb06-43ad-af44-f08fe0461fc6","Type":"ContainerStarted","Data":"9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed"} Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.639720 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.716990 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-bundle\") pod \"255a795a-3aab-4e6f-a5b8-4baecc18d798\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.717137 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-599bc\" (UniqueName: \"kubernetes.io/projected/255a795a-3aab-4e6f-a5b8-4baecc18d798-kube-api-access-599bc\") pod \"255a795a-3aab-4e6f-a5b8-4baecc18d798\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.717168 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-util\") pod \"255a795a-3aab-4e6f-a5b8-4baecc18d798\" (UID: \"255a795a-3aab-4e6f-a5b8-4baecc18d798\") " Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.717890 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-bundle" (OuterVolumeSpecName: "bundle") pod "255a795a-3aab-4e6f-a5b8-4baecc18d798" (UID: "255a795a-3aab-4e6f-a5b8-4baecc18d798"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.729331 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-util" (OuterVolumeSpecName: "util") pod "255a795a-3aab-4e6f-a5b8-4baecc18d798" (UID: "255a795a-3aab-4e6f-a5b8-4baecc18d798"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.739700 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255a795a-3aab-4e6f-a5b8-4baecc18d798-kube-api-access-599bc" (OuterVolumeSpecName: "kube-api-access-599bc") pod "255a795a-3aab-4e6f-a5b8-4baecc18d798" (UID: "255a795a-3aab-4e6f-a5b8-4baecc18d798"). InnerVolumeSpecName "kube-api-access-599bc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.818868 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-599bc\" (UniqueName: \"kubernetes.io/projected/255a795a-3aab-4e6f-a5b8-4baecc18d798-kube-api-access-599bc\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.818901 4842 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-util\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:02 crc kubenswrapper[4842]: I0311 19:04:02.818913 4842 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/255a795a-3aab-4e6f-a5b8-4baecc18d798-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:03 crc kubenswrapper[4842]: I0311 19:04:03.351809 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" event={"ID":"255a795a-3aab-4e6f-a5b8-4baecc18d798","Type":"ContainerDied","Data":"43fce499d728b1af2d796de301a94b2307367fbcd7ab35528defc99a8292e2c9"} Mar 11 19:04:03 crc kubenswrapper[4842]: I0311 19:04:03.352228 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43fce499d728b1af2d796de301a94b2307367fbcd7ab35528defc99a8292e2c9" Mar 11 19:04:03 crc kubenswrapper[4842]: I0311 19:04:03.352092 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk" Mar 11 19:04:03 crc kubenswrapper[4842]: I0311 19:04:03.355057 4842 generic.go:334] "Generic (PLEG): container finished" podID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerID="9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed" exitCode=0 Mar 11 19:04:03 crc kubenswrapper[4842]: I0311 19:04:03.355117 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw7nx" event={"ID":"3d724675-eb06-43ad-af44-f08fe0461fc6","Type":"ContainerDied","Data":"9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed"} Mar 11 19:04:03 crc kubenswrapper[4842]: I0311 19:04:03.356971 4842 generic.go:334] "Generic (PLEG): container finished" podID="6340e59d-3320-4e33-872c-5e809b37cf69" containerID="f440bca9948b779e9200ccc88fc942abc9fa18ee2907d6784a57858fafbc21f2" exitCode=0 Mar 11 19:04:03 crc kubenswrapper[4842]: I0311 19:04:03.357011 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554264-265lh" event={"ID":"6340e59d-3320-4e33-872c-5e809b37cf69","Type":"ContainerDied","Data":"f440bca9948b779e9200ccc88fc942abc9fa18ee2907d6784a57858fafbc21f2"} Mar 11 19:04:04 crc kubenswrapper[4842]: I0311 19:04:04.622129 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554264-265lh" Mar 11 19:04:04 crc kubenswrapper[4842]: I0311 19:04:04.645087 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq6xz\" (UniqueName: \"kubernetes.io/projected/6340e59d-3320-4e33-872c-5e809b37cf69-kube-api-access-cq6xz\") pod \"6340e59d-3320-4e33-872c-5e809b37cf69\" (UID: \"6340e59d-3320-4e33-872c-5e809b37cf69\") " Mar 11 19:04:04 crc kubenswrapper[4842]: I0311 19:04:04.652508 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6340e59d-3320-4e33-872c-5e809b37cf69-kube-api-access-cq6xz" (OuterVolumeSpecName: "kube-api-access-cq6xz") pod "6340e59d-3320-4e33-872c-5e809b37cf69" (UID: "6340e59d-3320-4e33-872c-5e809b37cf69"). InnerVolumeSpecName "kube-api-access-cq6xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:04:04 crc kubenswrapper[4842]: I0311 19:04:04.746219 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq6xz\" (UniqueName: \"kubernetes.io/projected/6340e59d-3320-4e33-872c-5e809b37cf69-kube-api-access-cq6xz\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:05 crc kubenswrapper[4842]: I0311 19:04:05.371702 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554264-265lh" Mar 11 19:04:05 crc kubenswrapper[4842]: I0311 19:04:05.371696 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554264-265lh" event={"ID":"6340e59d-3320-4e33-872c-5e809b37cf69","Type":"ContainerDied","Data":"93aae693dd166bdbac441f1703009d67b060095834da324fcd9516145b17f95c"} Mar 11 19:04:05 crc kubenswrapper[4842]: I0311 19:04:05.372074 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93aae693dd166bdbac441f1703009d67b060095834da324fcd9516145b17f95c" Mar 11 19:04:05 crc kubenswrapper[4842]: I0311 19:04:05.377610 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw7nx" event={"ID":"3d724675-eb06-43ad-af44-f08fe0461fc6","Type":"ContainerStarted","Data":"639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0"} Mar 11 19:04:05 crc kubenswrapper[4842]: I0311 19:04:05.404522 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cw7nx" podStartSLOduration=2.952225599 podStartE2EDuration="7.404500235s" podCreationTimestamp="2026-03-11 19:03:58 +0000 UTC" firstStartedPulling="2026-03-11 19:04:00.3170703 +0000 UTC m=+885.964766580" lastFinishedPulling="2026-03-11 19:04:04.769344926 +0000 UTC m=+890.417041216" observedRunningTime="2026-03-11 19:04:05.399572841 +0000 UTC m=+891.047269151" watchObservedRunningTime="2026-03-11 19:04:05.404500235 +0000 UTC m=+891.052196535" Mar 11 19:04:05 crc kubenswrapper[4842]: I0311 19:04:05.677199 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554258-vbkkd"] Mar 11 19:04:05 crc kubenswrapper[4842]: I0311 19:04:05.679953 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554258-vbkkd"] Mar 11 19:04:06 crc kubenswrapper[4842]: I0311 19:04:06.972780 4842 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f246fb1-5e30-44fc-b041-8474f40b3936" path="/var/lib/kubelet/pods/5f246fb1-5e30-44fc-b041-8474f40b3936/volumes" Mar 11 19:04:09 crc kubenswrapper[4842]: I0311 19:04:09.177121 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:04:09 crc kubenswrapper[4842]: I0311 19:04:09.177615 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:04:10 crc kubenswrapper[4842]: I0311 19:04:10.224826 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cw7nx" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="registry-server" probeResult="failure" output=< Mar 11 19:04:10 crc kubenswrapper[4842]: timeout: failed to connect service ":50051" within 1s Mar 11 19:04:10 crc kubenswrapper[4842]: > Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.788440 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5644775d48-l26mz"] Mar 11 19:04:13 crc kubenswrapper[4842]: E0311 19:04:13.788907 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerName="extract" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.788920 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerName="extract" Mar 11 19:04:13 crc kubenswrapper[4842]: E0311 19:04:13.788934 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerName="util" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.788940 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerName="util" Mar 11 19:04:13 crc kubenswrapper[4842]: E0311 19:04:13.788953 4842 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6340e59d-3320-4e33-872c-5e809b37cf69" containerName="oc" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.788960 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6340e59d-3320-4e33-872c-5e809b37cf69" containerName="oc" Mar 11 19:04:13 crc kubenswrapper[4842]: E0311 19:04:13.788973 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerName="pull" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.788979 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerName="pull" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.789067 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="255a795a-3aab-4e6f-a5b8-4baecc18d798" containerName="extract" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.789076 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6340e59d-3320-4e33-872c-5e809b37cf69" containerName="oc" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.789440 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.792011 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ql4n6" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.792077 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.792239 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.792966 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.793230 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.846335 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5644775d48-l26mz"] Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.897407 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7922e71f-79aa-41c2-81c5-539d767c4d0e-webhook-cert\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.897446 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7dh\" (UniqueName: \"kubernetes.io/projected/7922e71f-79aa-41c2-81c5-539d767c4d0e-kube-api-access-qx7dh\") pod 
\"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.897474 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7922e71f-79aa-41c2-81c5-539d767c4d0e-apiservice-cert\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.998322 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7922e71f-79aa-41c2-81c5-539d767c4d0e-webhook-cert\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.998534 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7dh\" (UniqueName: \"kubernetes.io/projected/7922e71f-79aa-41c2-81c5-539d767c4d0e-kube-api-access-qx7dh\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:13 crc kubenswrapper[4842]: I0311 19:04:13.998632 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7922e71f-79aa-41c2-81c5-539d767c4d0e-apiservice-cert\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:14 crc 
kubenswrapper[4842]: I0311 19:04:14.004311 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7922e71f-79aa-41c2-81c5-539d767c4d0e-apiservice-cert\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.004777 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7922e71f-79aa-41c2-81c5-539d767c4d0e-webhook-cert\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.015984 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7dh\" (UniqueName: \"kubernetes.io/projected/7922e71f-79aa-41c2-81c5-539d767c4d0e-kube-api-access-qx7dh\") pod \"metallb-operator-controller-manager-5644775d48-l26mz\" (UID: \"7922e71f-79aa-41c2-81c5-539d767c4d0e\") " pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.108053 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.117698 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc"] Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.118545 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.122056 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.122070 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.122475 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2zzzx" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.137974 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc"] Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.306100 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2d79a37-4861-45b1-b7fd-7f489497cacf-apiservice-cert\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.307667 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cvm8\" (UniqueName: \"kubernetes.io/projected/a2d79a37-4861-45b1-b7fd-7f489497cacf-kube-api-access-2cvm8\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.307927 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/a2d79a37-4861-45b1-b7fd-7f489497cacf-webhook-cert\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.370239 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5644775d48-l26mz"] Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.412225 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2d79a37-4861-45b1-b7fd-7f489497cacf-webhook-cert\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.412308 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2d79a37-4861-45b1-b7fd-7f489497cacf-apiservice-cert\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.412350 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cvm8\" (UniqueName: \"kubernetes.io/projected/a2d79a37-4861-45b1-b7fd-7f489497cacf-kube-api-access-2cvm8\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.419905 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a2d79a37-4861-45b1-b7fd-7f489497cacf-apiservice-cert\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.424974 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2d79a37-4861-45b1-b7fd-7f489497cacf-webhook-cert\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.445736 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cvm8\" (UniqueName: \"kubernetes.io/projected/a2d79a37-4861-45b1-b7fd-7f489497cacf-kube-api-access-2cvm8\") pod \"metallb-operator-webhook-server-7856df687f-n2vkc\" (UID: \"a2d79a37-4861-45b1-b7fd-7f489497cacf\") " pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.480333 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.513152 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" event={"ID":"7922e71f-79aa-41c2-81c5-539d767c4d0e","Type":"ContainerStarted","Data":"511f8eacc89c10fb1555e2bcba79b60459739a49e698d3b1e479ffc2c824d782"} Mar 11 19:04:14 crc kubenswrapper[4842]: I0311 19:04:14.683503 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc"] Mar 11 19:04:14 crc kubenswrapper[4842]: W0311 19:04:14.691231 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d79a37_4861_45b1_b7fd_7f489497cacf.slice/crio-ff205f3a6aaba10d3d91b0ee05c54a84e5c6d2752253f2fb5e97ebdc35ad73e6 WatchSource:0}: Error finding container ff205f3a6aaba10d3d91b0ee05c54a84e5c6d2752253f2fb5e97ebdc35ad73e6: Status 404 returned error can't find the container with id ff205f3a6aaba10d3d91b0ee05c54a84e5c6d2752253f2fb5e97ebdc35ad73e6 Mar 11 19:04:15 crc kubenswrapper[4842]: I0311 19:04:15.520361 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" event={"ID":"a2d79a37-4861-45b1-b7fd-7f489497cacf","Type":"ContainerStarted","Data":"ff205f3a6aaba10d3d91b0ee05c54a84e5c6d2752253f2fb5e97ebdc35ad73e6"} Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.222012 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.281922 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.560504 4842 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" event={"ID":"a2d79a37-4861-45b1-b7fd-7f489497cacf","Type":"ContainerStarted","Data":"61e540890fed8301e25996119160797b2bb481f91ff162124796ce6b098a1434"} Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.560581 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.562482 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" event={"ID":"7922e71f-79aa-41c2-81c5-539d767c4d0e","Type":"ContainerStarted","Data":"89c47c89d8b1befc9fb2e545aef94c68a1eee00c61a8988c21ddd2e9802db3af"} Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.562760 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.580453 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" podStartSLOduration=0.852828182 podStartE2EDuration="5.58043295s" podCreationTimestamp="2026-03-11 19:04:14 +0000 UTC" firstStartedPulling="2026-03-11 19:04:14.694493952 +0000 UTC m=+900.342190232" lastFinishedPulling="2026-03-11 19:04:19.42209872 +0000 UTC m=+905.069795000" observedRunningTime="2026-03-11 19:04:19.576577165 +0000 UTC m=+905.224273445" watchObservedRunningTime="2026-03-11 19:04:19.58043295 +0000 UTC m=+905.228129230" Mar 11 19:04:19 crc kubenswrapper[4842]: I0311 19:04:19.608081 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" podStartSLOduration=3.082123132 podStartE2EDuration="6.608064834s" podCreationTimestamp="2026-03-11 19:04:13 +0000 UTC" firstStartedPulling="2026-03-11 
19:04:14.384801292 +0000 UTC m=+900.032497572" lastFinishedPulling="2026-03-11 19:04:17.910742994 +0000 UTC m=+903.558439274" observedRunningTime="2026-03-11 19:04:19.603751626 +0000 UTC m=+905.251447906" watchObservedRunningTime="2026-03-11 19:04:19.608064834 +0000 UTC m=+905.255761114" Mar 11 19:04:21 crc kubenswrapper[4842]: I0311 19:04:21.643000 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cw7nx"] Mar 11 19:04:21 crc kubenswrapper[4842]: I0311 19:04:21.644306 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cw7nx" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="registry-server" containerID="cri-o://639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0" gracePeriod=2 Mar 11 19:04:21 crc kubenswrapper[4842]: I0311 19:04:21.992810 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.125557 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-utilities\") pod \"3d724675-eb06-43ad-af44-f08fe0461fc6\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.125632 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pjcf\" (UniqueName: \"kubernetes.io/projected/3d724675-eb06-43ad-af44-f08fe0461fc6-kube-api-access-9pjcf\") pod \"3d724675-eb06-43ad-af44-f08fe0461fc6\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.125734 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-catalog-content\") pod 
\"3d724675-eb06-43ad-af44-f08fe0461fc6\" (UID: \"3d724675-eb06-43ad-af44-f08fe0461fc6\") " Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.127192 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-utilities" (OuterVolumeSpecName: "utilities") pod "3d724675-eb06-43ad-af44-f08fe0461fc6" (UID: "3d724675-eb06-43ad-af44-f08fe0461fc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.130831 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d724675-eb06-43ad-af44-f08fe0461fc6-kube-api-access-9pjcf" (OuterVolumeSpecName: "kube-api-access-9pjcf") pod "3d724675-eb06-43ad-af44-f08fe0461fc6" (UID: "3d724675-eb06-43ad-af44-f08fe0461fc6"). InnerVolumeSpecName "kube-api-access-9pjcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.227741 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.227803 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pjcf\" (UniqueName: \"kubernetes.io/projected/3d724675-eb06-43ad-af44-f08fe0461fc6-kube-api-access-9pjcf\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.259194 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d724675-eb06-43ad-af44-f08fe0461fc6" (UID: "3d724675-eb06-43ad-af44-f08fe0461fc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.329349 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d724675-eb06-43ad-af44-f08fe0461fc6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.592473 4842 generic.go:334] "Generic (PLEG): container finished" podID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerID="639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0" exitCode=0 Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.592532 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw7nx" event={"ID":"3d724675-eb06-43ad-af44-f08fe0461fc6","Type":"ContainerDied","Data":"639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0"} Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.592564 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cw7nx" event={"ID":"3d724675-eb06-43ad-af44-f08fe0461fc6","Type":"ContainerDied","Data":"b3d872b6322ecb5dcf38b84aadd649a8fc002ac0beaeb7939dfb2b41b928701c"} Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.592586 4842 scope.go:117] "RemoveContainer" containerID="639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.592536 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cw7nx" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.614804 4842 scope.go:117] "RemoveContainer" containerID="9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.632018 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cw7nx"] Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.633312 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cw7nx"] Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.643697 4842 scope.go:117] "RemoveContainer" containerID="352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.656338 4842 scope.go:117] "RemoveContainer" containerID="639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0" Mar 11 19:04:22 crc kubenswrapper[4842]: E0311 19:04:22.656806 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0\": container with ID starting with 639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0 not found: ID does not exist" containerID="639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.656836 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0"} err="failed to get container status \"639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0\": rpc error: code = NotFound desc = could not find container \"639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0\": container with ID starting with 639076ff0f6550081095e1b7a69210d474fe35de01d44f4ce0372a31c47c54c0 not found: ID does 
not exist" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.656862 4842 scope.go:117] "RemoveContainer" containerID="9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed" Mar 11 19:04:22 crc kubenswrapper[4842]: E0311 19:04:22.657122 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed\": container with ID starting with 9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed not found: ID does not exist" containerID="9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.657143 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed"} err="failed to get container status \"9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed\": rpc error: code = NotFound desc = could not find container \"9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed\": container with ID starting with 9562412c4ef7fe6b942a6a108cb752699b366407ad6e84f02229288cdf8923ed not found: ID does not exist" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.657159 4842 scope.go:117] "RemoveContainer" containerID="352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd" Mar 11 19:04:22 crc kubenswrapper[4842]: E0311 19:04:22.657385 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd\": container with ID starting with 352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd not found: ID does not exist" containerID="352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.657407 4842 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd"} err="failed to get container status \"352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd\": rpc error: code = NotFound desc = could not find container \"352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd\": container with ID starting with 352e4bc46f71b20324ad27e22d4a0c64cee02669015d81b924f614d56f946bfd not found: ID does not exist" Mar 11 19:04:22 crc kubenswrapper[4842]: I0311 19:04:22.972486 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" path="/var/lib/kubelet/pods/3d724675-eb06-43ad-af44-f08fe0461fc6/volumes" Mar 11 19:04:23 crc kubenswrapper[4842]: I0311 19:04:23.951888 4842 scope.go:117] "RemoveContainer" containerID="7077e3ce76b911406b7ad5c2cb13c14ae14dab9da0067145247d697f3a09100f" Mar 11 19:04:34 crc kubenswrapper[4842]: I0311 19:04:34.484871 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7856df687f-n2vkc" Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.110694 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5644775d48-l26mz" Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.883037 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"] Mar 11 19:04:54 crc kubenswrapper[4842]: E0311 19:04:54.883351 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="registry-server" Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.883373 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="registry-server" Mar 11 19:04:54 crc kubenswrapper[4842]: E0311 19:04:54.883389 4842 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="extract-content"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.883397 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="extract-content"
Mar 11 19:04:54 crc kubenswrapper[4842]: E0311 19:04:54.883411 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="extract-utilities"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.883419 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="extract-utilities"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.883542 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d724675-eb06-43ad-af44-f08fe0461fc6" containerName="registry-server"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.884002 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.887099 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.888660 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xhxtd"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.898925 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hkvd5"]
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.925492 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.930498 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"]
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.931402 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.933720 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967220 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-sockets\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967272 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967311 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics-certs\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967337 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql68q\" (UniqueName: \"kubernetes.io/projected/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-kube-api-access-ql68q\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967360 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eca2d6af-43c5-40c2-9589-20e998cdd092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kmmqf\" (UID: \"eca2d6af-43c5-40c2-9589-20e998cdd092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967380 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-conf\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967410 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-reloader\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967438 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-startup\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.967478 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhqjr\" (UniqueName: \"kubernetes.io/projected/eca2d6af-43c5-40c2-9589-20e998cdd092-kube-api-access-mhqjr\") pod \"frr-k8s-webhook-server-bcc4b6f68-kmmqf\" (UID: \"eca2d6af-43c5-40c2-9589-20e998cdd092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.970731 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8h7pw"]
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.971870 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.978541 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.978687 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-8n74r"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.978791 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.980886 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.992891 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-qgmvl"]
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.994057 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:54 crc kubenswrapper[4842]: I0311 19:04:54.995811 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.005490 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qgmvl"]
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.068825 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics-certs\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.068873 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqx2v\" (UniqueName: \"kubernetes.io/projected/24fd244e-dd50-4270-ad2c-950f5b3f7483-kube-api-access-kqx2v\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.068902 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql68q\" (UniqueName: \"kubernetes.io/projected/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-kube-api-access-ql68q\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.068930 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eca2d6af-43c5-40c2-9589-20e998cdd092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kmmqf\" (UID: \"eca2d6af-43c5-40c2-9589-20e998cdd092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.068950 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-conf\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.068991 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-reloader\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.068996 4842 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069022 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24fd244e-dd50-4270-ad2c-950f5b3f7483-metallb-excludel2\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069047 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-startup\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069080 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-cert\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.069111 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics-certs podName:72b5889e-3eab-414e-ad5d-f6a74b2ec5fe nodeName:}" failed. No retries permitted until 2026-03-11 19:04:55.569088929 +0000 UTC m=+941.216785279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics-certs") pod "frr-k8s-hkvd5" (UID: "72b5889e-3eab-414e-ad5d-f6a74b2ec5fe") : secret "frr-k8s-certs-secret" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069133 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-metrics-certs\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069186 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhqjr\" (UniqueName: \"kubernetes.io/projected/eca2d6af-43c5-40c2-9589-20e998cdd092-kube-api-access-mhqjr\") pod \"frr-k8s-webhook-server-bcc4b6f68-kmmqf\" (UID: \"eca2d6af-43c5-40c2-9589-20e998cdd092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069224 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069257 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-metrics-certs\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069294 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnwj2\" (UniqueName: \"kubernetes.io/projected/327901ef-4482-4254-be1f-daa388e6a1f2-kube-api-access-lnwj2\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069327 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-sockets\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069348 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.069548 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-conf\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.070219 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.070373 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-reloader\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.070444 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-sockets\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.070766 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-frr-startup\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.074593 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eca2d6af-43c5-40c2-9589-20e998cdd092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kmmqf\" (UID: \"eca2d6af-43c5-40c2-9589-20e998cdd092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.084140 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql68q\" (UniqueName: \"kubernetes.io/projected/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-kube-api-access-ql68q\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.085929 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhqjr\" (UniqueName: \"kubernetes.io/projected/eca2d6af-43c5-40c2-9589-20e998cdd092-kube-api-access-mhqjr\") pod \"frr-k8s-webhook-server-bcc4b6f68-kmmqf\" (UID: \"eca2d6af-43c5-40c2-9589-20e998cdd092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.169957 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-metrics-certs\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.169997 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnwj2\" (UniqueName: \"kubernetes.io/projected/327901ef-4482-4254-be1f-daa388e6a1f2-kube-api-access-lnwj2\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.170037 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqx2v\" (UniqueName: \"kubernetes.io/projected/24fd244e-dd50-4270-ad2c-950f5b3f7483-kube-api-access-kqx2v\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.170070 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24fd244e-dd50-4270-ad2c-950f5b3f7483-metallb-excludel2\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.170092 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-cert\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.170112 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-metrics-certs\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.170142 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.170156 4842 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.170241 4842 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.170255 4842 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.170261 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-metrics-certs podName:327901ef-4482-4254-be1f-daa388e6a1f2 nodeName:}" failed. No retries permitted until 2026-03-11 19:04:55.670236659 +0000 UTC m=+941.317932959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-metrics-certs") pod "controller-7bb4cc7c98-qgmvl" (UID: "327901ef-4482-4254-be1f-daa388e6a1f2") : secret "controller-certs-secret" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.170377 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist podName:24fd244e-dd50-4270-ad2c-950f5b3f7483 nodeName:}" failed. No retries permitted until 2026-03-11 19:04:55.670363112 +0000 UTC m=+941.318059392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist") pod "speaker-8h7pw" (UID: "24fd244e-dd50-4270-ad2c-950f5b3f7483") : secret "metallb-memberlist" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.170393 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-metrics-certs podName:24fd244e-dd50-4270-ad2c-950f5b3f7483 nodeName:}" failed. No retries permitted until 2026-03-11 19:04:55.670385583 +0000 UTC m=+941.318081863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-metrics-certs") pod "speaker-8h7pw" (UID: "24fd244e-dd50-4270-ad2c-950f5b3f7483") : secret "speaker-certs-secret" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.170837 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24fd244e-dd50-4270-ad2c-950f5b3f7483-metallb-excludel2\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.172744 4842 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.187821 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-cert\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.190352 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqx2v\" (UniqueName: \"kubernetes.io/projected/24fd244e-dd50-4270-ad2c-950f5b3f7483-kube-api-access-kqx2v\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.192171 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnwj2\" (UniqueName: \"kubernetes.io/projected/327901ef-4482-4254-be1f-daa388e6a1f2-kube-api-access-lnwj2\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.227036 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.477002 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"]
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.574762 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics-certs\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.577671 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/72b5889e-3eab-414e-ad5d-f6a74b2ec5fe-metrics-certs\") pod \"frr-k8s-hkvd5\" (UID: \"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe\") " pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.676108 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-metrics-certs\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.676206 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.676245 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-metrics-certs\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.676452 4842 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: E0311 19:04:55.676526 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist podName:24fd244e-dd50-4270-ad2c-950f5b3f7483 nodeName:}" failed. No retries permitted until 2026-03-11 19:04:56.676505842 +0000 UTC m=+942.324202122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist") pod "speaker-8h7pw" (UID: "24fd244e-dd50-4270-ad2c-950f5b3f7483") : secret "metallb-memberlist" not found
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.682655 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327901ef-4482-4254-be1f-daa388e6a1f2-metrics-certs\") pod \"controller-7bb4cc7c98-qgmvl\" (UID: \"327901ef-4482-4254-be1f-daa388e6a1f2\") " pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.686236 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-metrics-certs\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.778642 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf" event={"ID":"eca2d6af-43c5-40c2-9589-20e998cdd092","Type":"ContainerStarted","Data":"97cdf10d73fe1a130948df765a87fcff8c034c0fa8dd598a9f777ea5ca2055c7"}
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.841679 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:04:55 crc kubenswrapper[4842]: I0311 19:04:55.908506 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.281684 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qgmvl"]
Mar 11 19:04:56 crc kubenswrapper[4842]: W0311 19:04:56.289738 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod327901ef_4482_4254_be1f_daa388e6a1f2.slice/crio-467ea546f0f8c91a9d2b649d02b61e7bd4e2e19c460f25e4bab284caec07c631 WatchSource:0}: Error finding container 467ea546f0f8c91a9d2b649d02b61e7bd4e2e19c460f25e4bab284caec07c631: Status 404 returned error can't find the container with id 467ea546f0f8c91a9d2b649d02b61e7bd4e2e19c460f25e4bab284caec07c631
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.689012 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.708447 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24fd244e-dd50-4270-ad2c-950f5b3f7483-memberlist\") pod \"speaker-8h7pw\" (UID: \"24fd244e-dd50-4270-ad2c-950f5b3f7483\") " pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.793080 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.794611 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qgmvl" event={"ID":"327901ef-4482-4254-be1f-daa388e6a1f2","Type":"ContainerStarted","Data":"47943bfe0819839e59cb394849c0c75d76d20aa7a9a8fdcf9b645edb94bf7d4a"}
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.794671 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qgmvl" event={"ID":"327901ef-4482-4254-be1f-daa388e6a1f2","Type":"ContainerStarted","Data":"bf946ec8e029a5344a9f5a8e168ed1f107683d5b7a392dd8175d2cb7af3ad48d"}
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.794689 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qgmvl" event={"ID":"327901ef-4482-4254-be1f-daa388e6a1f2","Type":"ContainerStarted","Data":"467ea546f0f8c91a9d2b649d02b61e7bd4e2e19c460f25e4bab284caec07c631"}
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.795553 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-qgmvl"
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.804907 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerStarted","Data":"0e9367636b22c5dcb6ac8d6340502918193b38d26677526bbab18736be89e435"}
Mar 11 19:04:56 crc kubenswrapper[4842]: I0311 19:04:56.831881 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-qgmvl" podStartSLOduration=2.831858514 podStartE2EDuration="2.831858514s" podCreationTimestamp="2026-03-11 19:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:04:56.82583581 +0000 UTC m=+942.473532110" watchObservedRunningTime="2026-03-11 19:04:56.831858514 +0000 UTC m=+942.479554794"
Mar 11 19:04:57 crc kubenswrapper[4842]: I0311 19:04:57.823361 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h7pw" event={"ID":"24fd244e-dd50-4270-ad2c-950f5b3f7483","Type":"ContainerStarted","Data":"8bc374faa7d5f3967344cf2214c9a1485473a21bf61b70fde4dc3cb038078cc3"}
Mar 11 19:04:57 crc kubenswrapper[4842]: I0311 19:04:57.823726 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h7pw" event={"ID":"24fd244e-dd50-4270-ad2c-950f5b3f7483","Type":"ContainerStarted","Data":"1a3ba99498f149603201840bdab1527fb8e09b3d2191d1f3a74be7d2062d5b0d"}
Mar 11 19:04:57 crc kubenswrapper[4842]: I0311 19:04:57.823738 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h7pw" event={"ID":"24fd244e-dd50-4270-ad2c-950f5b3f7483","Type":"ContainerStarted","Data":"bdeff64f36d57776bc73cae7372a4f1bf844567d11f754db4a60dfb0e426426f"}
Mar 11 19:04:57 crc kubenswrapper[4842]: I0311 19:04:57.823993 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8h7pw"
Mar 11 19:04:57 crc kubenswrapper[4842]: I0311 19:04:57.840821 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8h7pw" podStartSLOduration=3.8408029519999998 podStartE2EDuration="3.840802952s" podCreationTimestamp="2026-03-11 19:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:04:57.839651711 +0000 UTC m=+943.487348011" watchObservedRunningTime="2026-03-11 19:04:57.840802952 +0000 UTC m=+943.488499232"
Mar 11 19:05:01 crc kubenswrapper[4842]: I0311 19:05:01.471347 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:05:01 crc kubenswrapper[4842]: I0311 19:05:01.471656 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:05:02 crc kubenswrapper[4842]: I0311 19:05:02.887929 4842 generic.go:334] "Generic (PLEG): container finished" podID="72b5889e-3eab-414e-ad5d-f6a74b2ec5fe" containerID="da5a97622279faae27c00722515d9201e7add456b6a8a579b38d19b5c856374f" exitCode=0
Mar 11 19:05:02 crc kubenswrapper[4842]: I0311 19:05:02.887998 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerDied","Data":"da5a97622279faae27c00722515d9201e7add456b6a8a579b38d19b5c856374f"}
Mar 11 19:05:02 crc kubenswrapper[4842]: I0311 19:05:02.890011 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf" event={"ID":"eca2d6af-43c5-40c2-9589-20e998cdd092","Type":"ContainerStarted","Data":"bc73ddcd5f9bc94510e937c7901ee517db0186a8c2b58150b2e93efe1d12e961"}
Mar 11 19:05:02 crc kubenswrapper[4842]: I0311 19:05:02.890134 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf"
Mar 11 19:05:02 crc kubenswrapper[4842]: I0311 19:05:02.943781 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf" podStartSLOduration=1.889348299 podStartE2EDuration="8.943760721s" podCreationTimestamp="2026-03-11 19:04:54 +0000 UTC" firstStartedPulling="2026-03-11 19:04:55.486324393 +0000 UTC m=+941.134020673" lastFinishedPulling="2026-03-11 19:05:02.540736805 +0000 UTC m=+948.188433095" observedRunningTime="2026-03-11 19:05:02.939733042 +0000 UTC m=+948.587429322" watchObservedRunningTime="2026-03-11 19:05:02.943760721 +0000 UTC m=+948.591457011"
Mar 11 19:05:03 crc kubenswrapper[4842]: I0311 19:05:03.900008 4842 generic.go:334] "Generic (PLEG): container finished" podID="72b5889e-3eab-414e-ad5d-f6a74b2ec5fe" containerID="c5891a151eac74140fff14874cb18521213daff5f9f30132fb7c17919025e270" exitCode=0
Mar 11 19:05:03 crc kubenswrapper[4842]: I0311 19:05:03.900268 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerDied","Data":"c5891a151eac74140fff14874cb18521213daff5f9f30132fb7c17919025e270"}
Mar 11 19:05:04 crc kubenswrapper[4842]: I0311 19:05:04.909177 4842 generic.go:334] "Generic (PLEG): container finished" podID="72b5889e-3eab-414e-ad5d-f6a74b2ec5fe" containerID="1f2edcb33b0d168f04ab29a30101b794cf66fb2f445146279d0657184d621565" exitCode=0
Mar 11 19:05:04 crc kubenswrapper[4842]: I0311 19:05:04.909289 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerDied","Data":"1f2edcb33b0d168f04ab29a30101b794cf66fb2f445146279d0657184d621565"}
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.917792 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerStarted","Data":"6a0e9dd1f0f47bdec8ca0b225fa4cb576a6ec97fed234cdd9181e928bad7eff3"}
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.918681 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hkvd5"
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.918745 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerStarted","Data":"3c4f73843fdaf86a215f134d5abb7638a4fd233106b29d96ba9f4c1b8cd93e46"}
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.918809 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerStarted","Data":"140cd1bf43680dead001ad4842538a112dff9788b52388f0cf882be3bc453ecb"}
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.918874 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerStarted","Data":"31dfa71bc01f639459ed12ff7a4ef1861cfc1635e6ae2510a8ec1175b2fe017d"}
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.918927 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerStarted","Data":"3732390d04803e9956bbd0401f181beaa73a7c7d829fc7ebddfa45bd73b8a0a8"}
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.918979 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hkvd5" event={"ID":"72b5889e-3eab-414e-ad5d-f6a74b2ec5fe","Type":"ContainerStarted","Data":"88f3f9b910c50e5b49d4a50beb1c3369801395af8b65c73f4fbb15491f76a513"}
Mar 11 19:05:05 crc kubenswrapper[4842]: I0311 19:05:05.948381 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hkvd5" podStartSLOduration=5.345095485 podStartE2EDuration="11.948362839s" podCreationTimestamp="2026-03-11 19:04:54 +0000 UTC" firstStartedPulling="2026-03-11 19:04:55.953080027 +0000 UTC m=+941.600776307" lastFinishedPulling="2026-03-11 19:05:02.556347361 +0000 UTC m=+948.204043661" observedRunningTime="2026-03-11 19:05:05.939586559 +0000 UTC m=+951.587282849" watchObservedRunningTime="2026-03-11 19:05:05.948362839 +0000 UTC m=+951.596059119"
Mar 11
19:05:10 crc kubenswrapper[4842]: I0311 19:05:10.842867 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hkvd5" Mar 11 19:05:10 crc kubenswrapper[4842]: I0311 19:05:10.891341 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hkvd5" Mar 11 19:05:15 crc kubenswrapper[4842]: I0311 19:05:15.233630 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kmmqf" Mar 11 19:05:15 crc kubenswrapper[4842]: I0311 19:05:15.844403 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hkvd5" Mar 11 19:05:15 crc kubenswrapper[4842]: I0311 19:05:15.911482 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-qgmvl" Mar 11 19:05:16 crc kubenswrapper[4842]: I0311 19:05:16.797676 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8h7pw" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.120682 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v"] Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.123697 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.129079 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.143945 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v"] Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.216786 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.216873 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62t7k\" (UniqueName: \"kubernetes.io/projected/65175dce-175c-44e3-b1f1-a3f3607e0a25-kube-api-access-62t7k\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.216904 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: 
I0311 19:05:18.318516 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.318582 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62t7k\" (UniqueName: \"kubernetes.io/projected/65175dce-175c-44e3-b1f1-a3f3607e0a25-kube-api-access-62t7k\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.319026 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.319061 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.319212 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.337544 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62t7k\" (UniqueName: \"kubernetes.io/projected/65175dce-175c-44e3-b1f1-a3f3607e0a25-kube-api-access-62t7k\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.438529 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:18 crc kubenswrapper[4842]: I0311 19:05:18.861099 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v"] Mar 11 19:05:19 crc kubenswrapper[4842]: I0311 19:05:19.007519 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" event={"ID":"65175dce-175c-44e3-b1f1-a3f3607e0a25","Type":"ContainerStarted","Data":"20855c040d59836f6f0884642282a01bf6e7f4a976aa6d9251ef973c70e19780"} Mar 11 19:05:19 crc kubenswrapper[4842]: I0311 19:05:19.007562 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" event={"ID":"65175dce-175c-44e3-b1f1-a3f3607e0a25","Type":"ContainerStarted","Data":"031bff5008a37151d3a339cc52db92be27c696b630841bbaef9854b5d5abcde9"} Mar 11 19:05:20 crc kubenswrapper[4842]: I0311 19:05:20.016390 4842 
generic.go:334] "Generic (PLEG): container finished" podID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerID="20855c040d59836f6f0884642282a01bf6e7f4a976aa6d9251ef973c70e19780" exitCode=0 Mar 11 19:05:20 crc kubenswrapper[4842]: I0311 19:05:20.016444 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" event={"ID":"65175dce-175c-44e3-b1f1-a3f3607e0a25","Type":"ContainerDied","Data":"20855c040d59836f6f0884642282a01bf6e7f4a976aa6d9251ef973c70e19780"} Mar 11 19:05:20 crc kubenswrapper[4842]: I0311 19:05:20.019258 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 19:05:24 crc kubenswrapper[4842]: I0311 19:05:24.065749 4842 generic.go:334] "Generic (PLEG): container finished" podID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerID="0a9ca6f2c7e3047a11ec9e064ffe77f0eec36de2e4bfcbf4c4cf702463a13723" exitCode=0 Mar 11 19:05:24 crc kubenswrapper[4842]: I0311 19:05:24.065841 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" event={"ID":"65175dce-175c-44e3-b1f1-a3f3607e0a25","Type":"ContainerDied","Data":"0a9ca6f2c7e3047a11ec9e064ffe77f0eec36de2e4bfcbf4c4cf702463a13723"} Mar 11 19:05:25 crc kubenswrapper[4842]: I0311 19:05:25.076361 4842 generic.go:334] "Generic (PLEG): container finished" podID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerID="e6f796090d80cd69ac84084adea82702b8dd6b7b09953467b08440628f4d487c" exitCode=0 Mar 11 19:05:25 crc kubenswrapper[4842]: I0311 19:05:25.076441 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" event={"ID":"65175dce-175c-44e3-b1f1-a3f3607e0a25","Type":"ContainerDied","Data":"e6f796090d80cd69ac84084adea82702b8dd6b7b09953467b08440628f4d487c"} Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 
19:05:26.390680 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.422406 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62t7k\" (UniqueName: \"kubernetes.io/projected/65175dce-175c-44e3-b1f1-a3f3607e0a25-kube-api-access-62t7k\") pod \"65175dce-175c-44e3-b1f1-a3f3607e0a25\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.427755 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65175dce-175c-44e3-b1f1-a3f3607e0a25-kube-api-access-62t7k" (OuterVolumeSpecName: "kube-api-access-62t7k") pod "65175dce-175c-44e3-b1f1-a3f3607e0a25" (UID: "65175dce-175c-44e3-b1f1-a3f3607e0a25"). InnerVolumeSpecName "kube-api-access-62t7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.523329 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-bundle\") pod \"65175dce-175c-44e3-b1f1-a3f3607e0a25\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.523404 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-util\") pod \"65175dce-175c-44e3-b1f1-a3f3607e0a25\" (UID: \"65175dce-175c-44e3-b1f1-a3f3607e0a25\") " Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.523738 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62t7k\" (UniqueName: \"kubernetes.io/projected/65175dce-175c-44e3-b1f1-a3f3607e0a25-kube-api-access-62t7k\") on node \"crc\" DevicePath \"\"" Mar 11 19:05:26 crc 
kubenswrapper[4842]: I0311 19:05:26.524482 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-bundle" (OuterVolumeSpecName: "bundle") pod "65175dce-175c-44e3-b1f1-a3f3607e0a25" (UID: "65175dce-175c-44e3-b1f1-a3f3607e0a25"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.535761 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-util" (OuterVolumeSpecName: "util") pod "65175dce-175c-44e3-b1f1-a3f3607e0a25" (UID: "65175dce-175c-44e3-b1f1-a3f3607e0a25"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.624536 4842 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:05:26 crc kubenswrapper[4842]: I0311 19:05:26.624851 4842 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/65175dce-175c-44e3-b1f1-a3f3607e0a25-util\") on node \"crc\" DevicePath \"\"" Mar 11 19:05:27 crc kubenswrapper[4842]: I0311 19:05:27.088311 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" event={"ID":"65175dce-175c-44e3-b1f1-a3f3607e0a25","Type":"ContainerDied","Data":"031bff5008a37151d3a339cc52db92be27c696b630841bbaef9854b5d5abcde9"} Mar 11 19:05:27 crc kubenswrapper[4842]: I0311 19:05:27.088349 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031bff5008a37151d3a339cc52db92be27c696b630841bbaef9854b5d5abcde9" Mar 11 19:05:27 crc kubenswrapper[4842]: I0311 19:05:27.088404 4842 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.938564 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb"] Mar 11 19:05:30 crc kubenswrapper[4842]: E0311 19:05:30.940129 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerName="util" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.940150 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerName="util" Mar 11 19:05:30 crc kubenswrapper[4842]: E0311 19:05:30.940166 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerName="pull" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.940174 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerName="pull" Mar 11 19:05:30 crc kubenswrapper[4842]: E0311 19:05:30.940191 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerName="extract" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.940197 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerName="extract" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.940357 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="65175dce-175c-44e3-b1f1-a3f3607e0a25" containerName="extract" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.940951 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.945634 4842 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-lbrd5" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.951172 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.951396 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 11 19:05:30 crc kubenswrapper[4842]: I0311 19:05:30.957937 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb"] Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.022242 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpp8\" (UniqueName: \"kubernetes.io/projected/58934078-8e0b-4327-8f65-2ae09abf21e7-kube-api-access-bvpp8\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tnnnb\" (UID: \"58934078-8e0b-4327-8f65-2ae09abf21e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.022350 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58934078-8e0b-4327-8f65-2ae09abf21e7-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tnnnb\" (UID: \"58934078-8e0b-4327-8f65-2ae09abf21e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.123148 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/58934078-8e0b-4327-8f65-2ae09abf21e7-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tnnnb\" (UID: \"58934078-8e0b-4327-8f65-2ae09abf21e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.123309 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpp8\" (UniqueName: \"kubernetes.io/projected/58934078-8e0b-4327-8f65-2ae09abf21e7-kube-api-access-bvpp8\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tnnnb\" (UID: \"58934078-8e0b-4327-8f65-2ae09abf21e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.123719 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58934078-8e0b-4327-8f65-2ae09abf21e7-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tnnnb\" (UID: \"58934078-8e0b-4327-8f65-2ae09abf21e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.166415 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpp8\" (UniqueName: \"kubernetes.io/projected/58934078-8e0b-4327-8f65-2ae09abf21e7-kube-api-access-bvpp8\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tnnnb\" (UID: \"58934078-8e0b-4327-8f65-2ae09abf21e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.265757 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.474656 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.474875 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:05:31 crc kubenswrapper[4842]: I0311 19:05:31.550707 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb"] Mar 11 19:05:31 crc kubenswrapper[4842]: W0311 19:05:31.566440 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58934078_8e0b_4327_8f65_2ae09abf21e7.slice/crio-1622b309d6add91a5e558916e0833668cdebe52e31cde457b18a04b30aa79a39 WatchSource:0}: Error finding container 1622b309d6add91a5e558916e0833668cdebe52e31cde457b18a04b30aa79a39: Status 404 returned error can't find the container with id 1622b309d6add91a5e558916e0833668cdebe52e31cde457b18a04b30aa79a39 Mar 11 19:05:32 crc kubenswrapper[4842]: I0311 19:05:32.119152 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" event={"ID":"58934078-8e0b-4327-8f65-2ae09abf21e7","Type":"ContainerStarted","Data":"1622b309d6add91a5e558916e0833668cdebe52e31cde457b18a04b30aa79a39"} Mar 11 19:05:35 crc kubenswrapper[4842]: 
I0311 19:05:35.141171 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" event={"ID":"58934078-8e0b-4327-8f65-2ae09abf21e7","Type":"ContainerStarted","Data":"fcd95e7799703b68f056b933299a87efcb6fb86b3651a18f15d871a0eee6372a"} Mar 11 19:05:35 crc kubenswrapper[4842]: I0311 19:05:35.168051 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tnnnb" podStartSLOduration=2.012538936 podStartE2EDuration="5.168034801s" podCreationTimestamp="2026-03-11 19:05:30 +0000 UTC" firstStartedPulling="2026-03-11 19:05:31.568283465 +0000 UTC m=+977.215979745" lastFinishedPulling="2026-03-11 19:05:34.72377933 +0000 UTC m=+980.371475610" observedRunningTime="2026-03-11 19:05:35.166642535 +0000 UTC m=+980.814338815" watchObservedRunningTime="2026-03-11 19:05:35.168034801 +0000 UTC m=+980.815731081" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.479635 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l8g6h"] Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.481340 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.486449 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8g6h"] Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.503305 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vthhc\" (UniqueName: \"kubernetes.io/projected/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-kube-api-access-vthhc\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.503383 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-catalog-content\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.503411 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-utilities\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.604725 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vthhc\" (UniqueName: \"kubernetes.io/projected/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-kube-api-access-vthhc\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.604810 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-catalog-content\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.604843 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-utilities\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.605404 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-utilities\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.605468 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-catalog-content\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.632241 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vthhc\" (UniqueName: \"kubernetes.io/projected/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-kube-api-access-vthhc\") pod \"community-operators-l8g6h\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:36 crc kubenswrapper[4842]: I0311 19:05:36.799305 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:37 crc kubenswrapper[4842]: I0311 19:05:37.303374 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l8g6h"] Mar 11 19:05:38 crc kubenswrapper[4842]: I0311 19:05:38.157822 4842 generic.go:334] "Generic (PLEG): container finished" podID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerID="46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8" exitCode=0 Mar 11 19:05:38 crc kubenswrapper[4842]: I0311 19:05:38.157860 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g6h" event={"ID":"69ebe1d0-cf71-4151-a6c5-5f5fc516465b","Type":"ContainerDied","Data":"46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8"} Mar 11 19:05:38 crc kubenswrapper[4842]: I0311 19:05:38.159745 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g6h" event={"ID":"69ebe1d0-cf71-4151-a6c5-5f5fc516465b","Type":"ContainerStarted","Data":"675845e27c13025f12d0d9bfe78402e5ab0c5d9cb3bebc092424560490f85525"} Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.094065 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-l4bwp"] Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.101991 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.103849 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.104423 4842 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l2rlp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.111387 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-l4bwp"] Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.111449 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.167984 4842 generic.go:334] "Generic (PLEG): container finished" podID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerID="e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489" exitCode=0 Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.168038 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g6h" event={"ID":"69ebe1d0-cf71-4151-a6c5-5f5fc516465b","Type":"ContainerDied","Data":"e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489"} Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.232810 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8e1262e-1ae7-4979-ae34-605beb8c7c65-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-l4bwp\" (UID: \"c8e1262e-1ae7-4979-ae34-605beb8c7c65\") " pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.232872 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7hk8\" (UniqueName: 
\"kubernetes.io/projected/c8e1262e-1ae7-4979-ae34-605beb8c7c65-kube-api-access-x7hk8\") pod \"cert-manager-webhook-6888856db4-l4bwp\" (UID: \"c8e1262e-1ae7-4979-ae34-605beb8c7c65\") " pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.252927 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-d2sk4"] Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.253957 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.259715 4842 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-824pl" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.260156 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-d2sk4"] Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.334663 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8f91c28-37fb-4480-aa01-8e5caf168fc4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-d2sk4\" (UID: \"f8f91c28-37fb-4480-aa01-8e5caf168fc4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.334756 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8e1262e-1ae7-4979-ae34-605beb8c7c65-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-l4bwp\" (UID: \"c8e1262e-1ae7-4979-ae34-605beb8c7c65\") " pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.334790 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7hk8\" 
(UniqueName: \"kubernetes.io/projected/c8e1262e-1ae7-4979-ae34-605beb8c7c65-kube-api-access-x7hk8\") pod \"cert-manager-webhook-6888856db4-l4bwp\" (UID: \"c8e1262e-1ae7-4979-ae34-605beb8c7c65\") " pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.334889 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hhxh\" (UniqueName: \"kubernetes.io/projected/f8f91c28-37fb-4480-aa01-8e5caf168fc4-kube-api-access-4hhxh\") pod \"cert-manager-cainjector-5545bd876-d2sk4\" (UID: \"f8f91c28-37fb-4480-aa01-8e5caf168fc4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.351382 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8e1262e-1ae7-4979-ae34-605beb8c7c65-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-l4bwp\" (UID: \"c8e1262e-1ae7-4979-ae34-605beb8c7c65\") " pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.351758 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7hk8\" (UniqueName: \"kubernetes.io/projected/c8e1262e-1ae7-4979-ae34-605beb8c7c65-kube-api-access-x7hk8\") pod \"cert-manager-webhook-6888856db4-l4bwp\" (UID: \"c8e1262e-1ae7-4979-ae34-605beb8c7c65\") " pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.419514 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.436203 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hhxh\" (UniqueName: \"kubernetes.io/projected/f8f91c28-37fb-4480-aa01-8e5caf168fc4-kube-api-access-4hhxh\") pod \"cert-manager-cainjector-5545bd876-d2sk4\" (UID: \"f8f91c28-37fb-4480-aa01-8e5caf168fc4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.436261 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8f91c28-37fb-4480-aa01-8e5caf168fc4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-d2sk4\" (UID: \"f8f91c28-37fb-4480-aa01-8e5caf168fc4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.461937 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8f91c28-37fb-4480-aa01-8e5caf168fc4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-d2sk4\" (UID: \"f8f91c28-37fb-4480-aa01-8e5caf168fc4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.462066 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hhxh\" (UniqueName: \"kubernetes.io/projected/f8f91c28-37fb-4480-aa01-8e5caf168fc4-kube-api-access-4hhxh\") pod \"cert-manager-cainjector-5545bd876-d2sk4\" (UID: \"f8f91c28-37fb-4480-aa01-8e5caf168fc4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.578667 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.850804 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-l4bwp"] Mar 11 19:05:39 crc kubenswrapper[4842]: W0311 19:05:39.858524 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8e1262e_1ae7_4979_ae34_605beb8c7c65.slice/crio-54998b29ec27c82f9383ea4407c63dced17f5ab65fa3569225871035e90aac82 WatchSource:0}: Error finding container 54998b29ec27c82f9383ea4407c63dced17f5ab65fa3569225871035e90aac82: Status 404 returned error can't find the container with id 54998b29ec27c82f9383ea4407c63dced17f5ab65fa3569225871035e90aac82 Mar 11 19:05:39 crc kubenswrapper[4842]: I0311 19:05:39.973185 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-d2sk4"] Mar 11 19:05:39 crc kubenswrapper[4842]: W0311 19:05:39.979641 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8f91c28_37fb_4480_aa01_8e5caf168fc4.slice/crio-c6dc529f262b8b4de97dcce80ddd8a8712606f14adb68b7eebc80cbb725d815d WatchSource:0}: Error finding container c6dc529f262b8b4de97dcce80ddd8a8712606f14adb68b7eebc80cbb725d815d: Status 404 returned error can't find the container with id c6dc529f262b8b4de97dcce80ddd8a8712606f14adb68b7eebc80cbb725d815d Mar 11 19:05:40 crc kubenswrapper[4842]: I0311 19:05:40.175830 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g6h" event={"ID":"69ebe1d0-cf71-4151-a6c5-5f5fc516465b","Type":"ContainerStarted","Data":"b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a"} Mar 11 19:05:40 crc kubenswrapper[4842]: I0311 19:05:40.176901 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" event={"ID":"c8e1262e-1ae7-4979-ae34-605beb8c7c65","Type":"ContainerStarted","Data":"54998b29ec27c82f9383ea4407c63dced17f5ab65fa3569225871035e90aac82"} Mar 11 19:05:40 crc kubenswrapper[4842]: I0311 19:05:40.178040 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" event={"ID":"f8f91c28-37fb-4480-aa01-8e5caf168fc4","Type":"ContainerStarted","Data":"c6dc529f262b8b4de97dcce80ddd8a8712606f14adb68b7eebc80cbb725d815d"} Mar 11 19:05:40 crc kubenswrapper[4842]: I0311 19:05:40.190946 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l8g6h" podStartSLOduration=2.754321009 podStartE2EDuration="4.190926346s" podCreationTimestamp="2026-03-11 19:05:36 +0000 UTC" firstStartedPulling="2026-03-11 19:05:38.15906545 +0000 UTC m=+983.806761730" lastFinishedPulling="2026-03-11 19:05:39.595670787 +0000 UTC m=+985.243367067" observedRunningTime="2026-03-11 19:05:40.189662803 +0000 UTC m=+985.837359093" watchObservedRunningTime="2026-03-11 19:05:40.190926346 +0000 UTC m=+985.838622646" Mar 11 19:05:45 crc kubenswrapper[4842]: I0311 19:05:45.229558 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" event={"ID":"c8e1262e-1ae7-4979-ae34-605beb8c7c65","Type":"ContainerStarted","Data":"78f5f72ddd248ba926062fc745728352b9f57147625be928ed6efa0dcc42f002"} Mar 11 19:05:45 crc kubenswrapper[4842]: I0311 19:05:45.230867 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:45 crc kubenswrapper[4842]: I0311 19:05:45.232732 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" 
event={"ID":"f8f91c28-37fb-4480-aa01-8e5caf168fc4","Type":"ContainerStarted","Data":"bb52e94302f98851c37c18941415ccf77885e4990718db772e9f7e0cca0530b0"} Mar 11 19:05:45 crc kubenswrapper[4842]: I0311 19:05:45.248863 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" podStartSLOduration=1.6181784559999999 podStartE2EDuration="6.248844111s" podCreationTimestamp="2026-03-11 19:05:39 +0000 UTC" firstStartedPulling="2026-03-11 19:05:39.860553325 +0000 UTC m=+985.508249605" lastFinishedPulling="2026-03-11 19:05:44.49121898 +0000 UTC m=+990.138915260" observedRunningTime="2026-03-11 19:05:45.248720128 +0000 UTC m=+990.896416438" watchObservedRunningTime="2026-03-11 19:05:45.248844111 +0000 UTC m=+990.896540391" Mar 11 19:05:45 crc kubenswrapper[4842]: I0311 19:05:45.266695 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-d2sk4" podStartSLOduration=1.772351165 podStartE2EDuration="6.266675355s" podCreationTimestamp="2026-03-11 19:05:39 +0000 UTC" firstStartedPulling="2026-03-11 19:05:39.982248559 +0000 UTC m=+985.629944839" lastFinishedPulling="2026-03-11 19:05:44.476572749 +0000 UTC m=+990.124269029" observedRunningTime="2026-03-11 19:05:45.262497976 +0000 UTC m=+990.910194266" watchObservedRunningTime="2026-03-11 19:05:45.266675355 +0000 UTC m=+990.914371635" Mar 11 19:05:46 crc kubenswrapper[4842]: I0311 19:05:46.799892 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:46 crc kubenswrapper[4842]: I0311 19:05:46.800173 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:46 crc kubenswrapper[4842]: I0311 19:05:46.838146 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l8g6h" 
Mar 11 19:05:47 crc kubenswrapper[4842]: I0311 19:05:47.295264 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.071958 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8g6h"] Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.266254 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l8g6h" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="registry-server" containerID="cri-o://b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a" gracePeriod=2 Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.422179 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-l4bwp" Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.655452 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.787571 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-catalog-content\") pod \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.787647 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vthhc\" (UniqueName: \"kubernetes.io/projected/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-kube-api-access-vthhc\") pod \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.787711 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-utilities\") pod \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\" (UID: \"69ebe1d0-cf71-4151-a6c5-5f5fc516465b\") " Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.788664 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-utilities" (OuterVolumeSpecName: "utilities") pod "69ebe1d0-cf71-4151-a6c5-5f5fc516465b" (UID: "69ebe1d0-cf71-4151-a6c5-5f5fc516465b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.792977 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-kube-api-access-vthhc" (OuterVolumeSpecName: "kube-api-access-vthhc") pod "69ebe1d0-cf71-4151-a6c5-5f5fc516465b" (UID: "69ebe1d0-cf71-4151-a6c5-5f5fc516465b"). InnerVolumeSpecName "kube-api-access-vthhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.889050 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vthhc\" (UniqueName: \"kubernetes.io/projected/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-kube-api-access-vthhc\") on node \"crc\" DevicePath \"\"" Mar 11 19:05:49 crc kubenswrapper[4842]: I0311 19:05:49.889085 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.276990 4842 generic.go:334] "Generic (PLEG): container finished" podID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerID="b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a" exitCode=0 Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.277049 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g6h" event={"ID":"69ebe1d0-cf71-4151-a6c5-5f5fc516465b","Type":"ContainerDied","Data":"b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a"} Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.277085 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l8g6h" event={"ID":"69ebe1d0-cf71-4151-a6c5-5f5fc516465b","Type":"ContainerDied","Data":"675845e27c13025f12d0d9bfe78402e5ab0c5d9cb3bebc092424560490f85525"} Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.277112 4842 scope.go:117] "RemoveContainer" containerID="b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.277302 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l8g6h" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.302042 4842 scope.go:117] "RemoveContainer" containerID="e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.317482 4842 scope.go:117] "RemoveContainer" containerID="46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.349460 4842 scope.go:117] "RemoveContainer" containerID="b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a" Mar 11 19:05:50 crc kubenswrapper[4842]: E0311 19:05:50.349965 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a\": container with ID starting with b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a not found: ID does not exist" containerID="b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.350026 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a"} err="failed to get container status \"b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a\": rpc error: code = NotFound desc = could not find container \"b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a\": container with ID starting with b9169de8ff1d791dfa40b1f34f52abec8e150a16eeeccfa143e6cc5b794e467a not found: ID does not exist" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.350058 4842 scope.go:117] "RemoveContainer" containerID="e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489" Mar 11 19:05:50 crc kubenswrapper[4842]: E0311 19:05:50.350421 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489\": container with ID starting with e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489 not found: ID does not exist" containerID="e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.350448 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489"} err="failed to get container status \"e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489\": rpc error: code = NotFound desc = could not find container \"e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489\": container with ID starting with e35cd4b649f97cc391c8dfdec904589a9afe7210aa16bd1cdbe1ecad4d733489 not found: ID does not exist" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.350472 4842 scope.go:117] "RemoveContainer" containerID="46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8" Mar 11 19:05:50 crc kubenswrapper[4842]: E0311 19:05:50.350897 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8\": container with ID starting with 46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8 not found: ID does not exist" containerID="46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8" Mar 11 19:05:50 crc kubenswrapper[4842]: I0311 19:05:50.350942 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8"} err="failed to get container status \"46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8\": rpc error: code = NotFound desc = could not find container 
\"46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8\": container with ID starting with 46440c077ff4c38f4aea8712e1b5a7a1034d4050aa71d409b825d27c9dca38a8 not found: ID does not exist" Mar 11 19:05:51 crc kubenswrapper[4842]: I0311 19:05:51.101063 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69ebe1d0-cf71-4151-a6c5-5f5fc516465b" (UID: "69ebe1d0-cf71-4151-a6c5-5f5fc516465b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:05:51 crc kubenswrapper[4842]: I0311 19:05:51.118836 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69ebe1d0-cf71-4151-a6c5-5f5fc516465b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:05:51 crc kubenswrapper[4842]: I0311 19:05:51.213176 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l8g6h"] Mar 11 19:05:51 crc kubenswrapper[4842]: I0311 19:05:51.217973 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l8g6h"] Mar 11 19:05:52 crc kubenswrapper[4842]: I0311 19:05:52.975751 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" path="/var/lib/kubelet/pods/69ebe1d0-cf71-4151-a6c5-5f5fc516465b/volumes" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.963612 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-xwpqh"] Mar 11 19:05:53 crc kubenswrapper[4842]: E0311 19:05:53.963898 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="extract-utilities" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.963919 4842 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="extract-utilities" Mar 11 19:05:53 crc kubenswrapper[4842]: E0311 19:05:53.963931 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="registry-server" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.963939 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="registry-server" Mar 11 19:05:53 crc kubenswrapper[4842]: E0311 19:05:53.963971 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="extract-content" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.963979 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="extract-content" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.964123 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ebe1d0-cf71-4151-a6c5-5f5fc516465b" containerName="registry-server" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.964613 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.968815 4842 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-mfdrv" Mar 11 19:05:53 crc kubenswrapper[4842]: I0311 19:05:53.986557 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-xwpqh"] Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.057113 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/921b3579-f653-4d22-8118-31ed2c6ed61c-bound-sa-token\") pod \"cert-manager-545d4d4674-xwpqh\" (UID: \"921b3579-f653-4d22-8118-31ed2c6ed61c\") " pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.057431 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mdq\" (UniqueName: \"kubernetes.io/projected/921b3579-f653-4d22-8118-31ed2c6ed61c-kube-api-access-92mdq\") pod \"cert-manager-545d4d4674-xwpqh\" (UID: \"921b3579-f653-4d22-8118-31ed2c6ed61c\") " pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.158724 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/921b3579-f653-4d22-8118-31ed2c6ed61c-bound-sa-token\") pod \"cert-manager-545d4d4674-xwpqh\" (UID: \"921b3579-f653-4d22-8118-31ed2c6ed61c\") " pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.158767 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92mdq\" (UniqueName: \"kubernetes.io/projected/921b3579-f653-4d22-8118-31ed2c6ed61c-kube-api-access-92mdq\") pod \"cert-manager-545d4d4674-xwpqh\" (UID: 
\"921b3579-f653-4d22-8118-31ed2c6ed61c\") " pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.175953 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/921b3579-f653-4d22-8118-31ed2c6ed61c-bound-sa-token\") pod \"cert-manager-545d4d4674-xwpqh\" (UID: \"921b3579-f653-4d22-8118-31ed2c6ed61c\") " pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.178960 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mdq\" (UniqueName: \"kubernetes.io/projected/921b3579-f653-4d22-8118-31ed2c6ed61c-kube-api-access-92mdq\") pod \"cert-manager-545d4d4674-xwpqh\" (UID: \"921b3579-f653-4d22-8118-31ed2c6ed61c\") " pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.285346 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-xwpqh" Mar 11 19:05:54 crc kubenswrapper[4842]: I0311 19:05:54.730092 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-xwpqh"] Mar 11 19:05:55 crc kubenswrapper[4842]: I0311 19:05:55.309544 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-xwpqh" event={"ID":"921b3579-f653-4d22-8118-31ed2c6ed61c","Type":"ContainerStarted","Data":"91323ef0dfd5a0bcdeb5c1dd990e5ac7bdd852fdf405511f5c0df3a9b891cd11"} Mar 11 19:05:55 crc kubenswrapper[4842]: I0311 19:05:55.309812 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-xwpqh" event={"ID":"921b3579-f653-4d22-8118-31ed2c6ed61c","Type":"ContainerStarted","Data":"2102da64ac274dea900595aecee55a9967953f484ec5e5f4d03cb2e271400fcf"} Mar 11 19:05:55 crc kubenswrapper[4842]: I0311 19:05:55.328927 4842 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-545d4d4674-xwpqh" podStartSLOduration=2.328910051 podStartE2EDuration="2.328910051s" podCreationTimestamp="2026-03-11 19:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:05:55.325741469 +0000 UTC m=+1000.973437749" watchObservedRunningTime="2026-03-11 19:05:55.328910051 +0000 UTC m=+1000.976606331" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.125264 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554266-x7vxv"] Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.126886 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554266-x7vxv" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.129373 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.129668 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.129838 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.135051 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554266-x7vxv"] Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.235418 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79jfv\" (UniqueName: \"kubernetes.io/projected/f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07-kube-api-access-79jfv\") pod \"auto-csr-approver-29554266-x7vxv\" (UID: \"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07\") " pod="openshift-infra/auto-csr-approver-29554266-x7vxv" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 
19:06:00.336454 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79jfv\" (UniqueName: \"kubernetes.io/projected/f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07-kube-api-access-79jfv\") pod \"auto-csr-approver-29554266-x7vxv\" (UID: \"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07\") " pod="openshift-infra/auto-csr-approver-29554266-x7vxv" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.377088 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79jfv\" (UniqueName: \"kubernetes.io/projected/f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07-kube-api-access-79jfv\") pod \"auto-csr-approver-29554266-x7vxv\" (UID: \"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07\") " pod="openshift-infra/auto-csr-approver-29554266-x7vxv" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.445061 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554266-x7vxv" Mar 11 19:06:00 crc kubenswrapper[4842]: I0311 19:06:00.896597 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554266-x7vxv"] Mar 11 19:06:01 crc kubenswrapper[4842]: I0311 19:06:01.346111 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554266-x7vxv" event={"ID":"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07","Type":"ContainerStarted","Data":"ec54c4be8c486ba406cc83418f2ed12d5a80370bbe146be27ffe9f2a01186003"} Mar 11 19:06:01 crc kubenswrapper[4842]: I0311 19:06:01.471878 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:06:01 crc kubenswrapper[4842]: I0311 19:06:01.471940 4842 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:06:01 crc kubenswrapper[4842]: I0311 19:06:01.471979 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 19:06:01 crc kubenswrapper[4842]: I0311 19:06:01.472579 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"741cee34022ccdac48b6c603ba201ced3a7f7803c4c8a38143440982a01cfafb"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 19:06:01 crc kubenswrapper[4842]: I0311 19:06:01.472664 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://741cee34022ccdac48b6c603ba201ced3a7f7803c4c8a38143440982a01cfafb" gracePeriod=600 Mar 11 19:06:02 crc kubenswrapper[4842]: I0311 19:06:02.357847 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="741cee34022ccdac48b6c603ba201ced3a7f7803c4c8a38143440982a01cfafb" exitCode=0 Mar 11 19:06:02 crc kubenswrapper[4842]: I0311 19:06:02.357927 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"741cee34022ccdac48b6c603ba201ced3a7f7803c4c8a38143440982a01cfafb"} Mar 11 19:06:02 crc kubenswrapper[4842]: I0311 19:06:02.359847 4842 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"d0ede0d62e1ca5af886d8ad032f52cc79f17aa7c91031e5d4935ed627d33421d"} Mar 11 19:06:02 crc kubenswrapper[4842]: I0311 19:06:02.359878 4842 scope.go:117] "RemoveContainer" containerID="dd17d155f0763fe0e3f142ca18755ab7a2e8fd0c5e83a7bdd2e0037d15a4c528" Mar 11 19:06:03 crc kubenswrapper[4842]: I0311 19:06:03.366819 4842 generic.go:334] "Generic (PLEG): container finished" podID="f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07" containerID="2bbe9e150869057fecfd22614ea7ace399559e54340b1bcac4f70859b35233a2" exitCode=0 Mar 11 19:06:03 crc kubenswrapper[4842]: I0311 19:06:03.366874 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554266-x7vxv" event={"ID":"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07","Type":"ContainerDied","Data":"2bbe9e150869057fecfd22614ea7ace399559e54340b1bcac4f70859b35233a2"} Mar 11 19:06:04 crc kubenswrapper[4842]: I0311 19:06:04.609705 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554266-x7vxv" Mar 11 19:06:04 crc kubenswrapper[4842]: I0311 19:06:04.700379 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79jfv\" (UniqueName: \"kubernetes.io/projected/f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07-kube-api-access-79jfv\") pod \"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07\" (UID: \"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07\") " Mar 11 19:06:04 crc kubenswrapper[4842]: I0311 19:06:04.706226 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07-kube-api-access-79jfv" (OuterVolumeSpecName: "kube-api-access-79jfv") pod "f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07" (UID: "f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07"). InnerVolumeSpecName "kube-api-access-79jfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:06:04 crc kubenswrapper[4842]: I0311 19:06:04.802694 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79jfv\" (UniqueName: \"kubernetes.io/projected/f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07-kube-api-access-79jfv\") on node \"crc\" DevicePath \"\"" Mar 11 19:06:05 crc kubenswrapper[4842]: I0311 19:06:05.383853 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554266-x7vxv" event={"ID":"f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07","Type":"ContainerDied","Data":"ec54c4be8c486ba406cc83418f2ed12d5a80370bbe146be27ffe9f2a01186003"} Mar 11 19:06:05 crc kubenswrapper[4842]: I0311 19:06:05.384405 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec54c4be8c486ba406cc83418f2ed12d5a80370bbe146be27ffe9f2a01186003" Mar 11 19:06:05 crc kubenswrapper[4842]: I0311 19:06:05.384009 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554266-x7vxv" Mar 11 19:06:05 crc kubenswrapper[4842]: I0311 19:06:05.657843 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554260-bkntv"] Mar 11 19:06:05 crc kubenswrapper[4842]: I0311 19:06:05.662799 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554260-bkntv"] Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.083394 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hvhrl"] Mar 11 19:06:06 crc kubenswrapper[4842]: E0311 19:06:06.083680 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07" containerName="oc" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.083697 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07" containerName="oc" Mar 11 19:06:06 crc 
kubenswrapper[4842]: I0311 19:06:06.083844 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07" containerName="oc" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.084319 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.086648 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.087900 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-knhht" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.088153 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.092830 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvhrl"] Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.118535 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwz9z\" (UniqueName: \"kubernetes.io/projected/1b6f8f46-7c23-4380-b8e7-585c3e32ab04-kube-api-access-vwz9z\") pod \"openstack-operator-index-hvhrl\" (UID: \"1b6f8f46-7c23-4380-b8e7-585c3e32ab04\") " pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.219568 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwz9z\" (UniqueName: \"kubernetes.io/projected/1b6f8f46-7c23-4380-b8e7-585c3e32ab04-kube-api-access-vwz9z\") pod \"openstack-operator-index-hvhrl\" (UID: \"1b6f8f46-7c23-4380-b8e7-585c3e32ab04\") " pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 
19:06:06.241691 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwz9z\" (UniqueName: \"kubernetes.io/projected/1b6f8f46-7c23-4380-b8e7-585c3e32ab04-kube-api-access-vwz9z\") pod \"openstack-operator-index-hvhrl\" (UID: \"1b6f8f46-7c23-4380-b8e7-585c3e32ab04\") " pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.401596 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.789851 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvhrl"] Mar 11 19:06:06 crc kubenswrapper[4842]: I0311 19:06:06.975667 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6198d49-2e45-41da-9980-ada387fc0276" path="/var/lib/kubelet/pods/a6198d49-2e45-41da-9980-ada387fc0276/volumes" Mar 11 19:06:07 crc kubenswrapper[4842]: I0311 19:06:07.396338 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvhrl" event={"ID":"1b6f8f46-7c23-4380-b8e7-585c3e32ab04","Type":"ContainerStarted","Data":"76f901593c3dddde52b7efcc5622d8ef373a41482b07d3ba66e969288a31489a"} Mar 11 19:06:09 crc kubenswrapper[4842]: I0311 19:06:09.409077 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvhrl" event={"ID":"1b6f8f46-7c23-4380-b8e7-585c3e32ab04","Type":"ContainerStarted","Data":"47a7434484ea633e0e6a027e2b6b046d8847ab89f7f546b1627230d5b585e515"} Mar 11 19:06:16 crc kubenswrapper[4842]: I0311 19:06:16.402035 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:16 crc kubenswrapper[4842]: I0311 19:06:16.402686 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:16 crc kubenswrapper[4842]: I0311 19:06:16.438248 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:16 crc kubenswrapper[4842]: I0311 19:06:16.455926 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hvhrl" podStartSLOduration=8.043018499 podStartE2EDuration="10.455908513s" podCreationTimestamp="2026-03-11 19:06:06 +0000 UTC" firstStartedPulling="2026-03-11 19:06:06.796311637 +0000 UTC m=+1012.444007917" lastFinishedPulling="2026-03-11 19:06:09.209201651 +0000 UTC m=+1014.856897931" observedRunningTime="2026-03-11 19:06:09.425803674 +0000 UTC m=+1015.073499954" watchObservedRunningTime="2026-03-11 19:06:16.455908513 +0000 UTC m=+1022.103604803" Mar 11 19:06:16 crc kubenswrapper[4842]: I0311 19:06:16.748670 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hvhrl" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.560198 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6"] Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.561747 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.564399 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pg4cf" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.582925 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6"] Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.626967 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdbnh\" (UniqueName: \"kubernetes.io/projected/e60d545e-d480-44f7-8c67-bba9975dd402-kube-api-access-rdbnh\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.627146 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-util\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.627210 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-bundle\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 
19:06:21.727789 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdbnh\" (UniqueName: \"kubernetes.io/projected/e60d545e-d480-44f7-8c67-bba9975dd402-kube-api-access-rdbnh\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.727843 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-util\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.727865 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-bundle\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.728318 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-bundle\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.728347 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-util\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.746887 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdbnh\" (UniqueName: \"kubernetes.io/projected/e60d545e-d480-44f7-8c67-bba9975dd402-kube-api-access-rdbnh\") pod \"6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:21 crc kubenswrapper[4842]: I0311 19:06:21.877609 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.294853 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6"] Mar 11 19:06:22 crc kubenswrapper[4842]: W0311 19:06:22.297505 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60d545e_d480_44f7_8c67_bba9975dd402.slice/crio-81f095ce0069615a53da5ce14da9e6f5c5fd75a1520860dc4b60c1723c429194 WatchSource:0}: Error finding container 81f095ce0069615a53da5ce14da9e6f5c5fd75a1520860dc4b60c1723c429194: Status 404 returned error can't find the container with id 81f095ce0069615a53da5ce14da9e6f5c5fd75a1520860dc4b60c1723c429194 Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.495163 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zcnxm"] Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.497585 4842 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.506054 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zcnxm"] Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.537521 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-utilities\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.537567 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hwn6\" (UniqueName: \"kubernetes.io/projected/44630950-e31c-4734-9829-8ab16b54c9ee-kube-api-access-2hwn6\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.537633 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-catalog-content\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.639444 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-utilities\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.639491 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hwn6\" (UniqueName: \"kubernetes.io/projected/44630950-e31c-4734-9829-8ab16b54c9ee-kube-api-access-2hwn6\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.639547 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-catalog-content\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.640081 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-utilities\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.640123 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-catalog-content\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.663712 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hwn6\" (UniqueName: \"kubernetes.io/projected/44630950-e31c-4734-9829-8ab16b54c9ee-kube-api-access-2hwn6\") pod \"certified-operators-zcnxm\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") " pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.755193 4842 generic.go:334] "Generic 
(PLEG): container finished" podID="e60d545e-d480-44f7-8c67-bba9975dd402" containerID="1096470eaff0798131e72a78f79484a7fdd3c04d0b12b44525a0bf80394c5b77" exitCode=0 Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.755244 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" event={"ID":"e60d545e-d480-44f7-8c67-bba9975dd402","Type":"ContainerDied","Data":"1096470eaff0798131e72a78f79484a7fdd3c04d0b12b44525a0bf80394c5b77"} Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.755309 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" event={"ID":"e60d545e-d480-44f7-8c67-bba9975dd402","Type":"ContainerStarted","Data":"81f095ce0069615a53da5ce14da9e6f5c5fd75a1520860dc4b60c1723c429194"} Mar 11 19:06:22 crc kubenswrapper[4842]: I0311 19:06:22.818723 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:23 crc kubenswrapper[4842]: I0311 19:06:23.279592 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zcnxm"] Mar 11 19:06:23 crc kubenswrapper[4842]: I0311 19:06:23.764533 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" event={"ID":"e60d545e-d480-44f7-8c67-bba9975dd402","Type":"ContainerDied","Data":"c9ea9237907f3bd6275e5f533f50763384e2834c5ba3c199d38f23ee8fd69821"} Mar 11 19:06:23 crc kubenswrapper[4842]: I0311 19:06:23.764508 4842 generic.go:334] "Generic (PLEG): container finished" podID="e60d545e-d480-44f7-8c67-bba9975dd402" containerID="c9ea9237907f3bd6275e5f533f50763384e2834c5ba3c199d38f23ee8fd69821" exitCode=0 Mar 11 19:06:23 crc kubenswrapper[4842]: I0311 19:06:23.767177 4842 generic.go:334] "Generic (PLEG): container finished" 
podID="44630950-e31c-4734-9829-8ab16b54c9ee" containerID="fa57691dfd6e11d838ea8bfe910434e97720a406ea89401050374eb69c8883e3" exitCode=0 Mar 11 19:06:23 crc kubenswrapper[4842]: I0311 19:06:23.767231 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcnxm" event={"ID":"44630950-e31c-4734-9829-8ab16b54c9ee","Type":"ContainerDied","Data":"fa57691dfd6e11d838ea8bfe910434e97720a406ea89401050374eb69c8883e3"} Mar 11 19:06:23 crc kubenswrapper[4842]: I0311 19:06:23.767358 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcnxm" event={"ID":"44630950-e31c-4734-9829-8ab16b54c9ee","Type":"ContainerStarted","Data":"05f9cce80b6dd528659f9ae8475c26e97b1887ba5af087e7d0688cadcb778708"} Mar 11 19:06:24 crc kubenswrapper[4842]: I0311 19:06:24.028762 4842 scope.go:117] "RemoveContainer" containerID="fce9e339213a9e76d6851137264b2cd4b9473d4dca280a11a64445a80425708a" Mar 11 19:06:24 crc kubenswrapper[4842]: I0311 19:06:24.774743 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcnxm" event={"ID":"44630950-e31c-4734-9829-8ab16b54c9ee","Type":"ContainerStarted","Data":"e5599e9e18a3ef3b35860affa10bafa3681c0a8d6c8df33f2d0e08c7ad9cc809"} Mar 11 19:06:24 crc kubenswrapper[4842]: I0311 19:06:24.777572 4842 generic.go:334] "Generic (PLEG): container finished" podID="e60d545e-d480-44f7-8c67-bba9975dd402" containerID="ab3ca89f4313e45ea2d30a2795e80b4d011dd93c471bd906cd43fc33aa2b57dc" exitCode=0 Mar 11 19:06:24 crc kubenswrapper[4842]: I0311 19:06:24.777650 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" event={"ID":"e60d545e-d480-44f7-8c67-bba9975dd402","Type":"ContainerDied","Data":"ab3ca89f4313e45ea2d30a2795e80b4d011dd93c471bd906cd43fc33aa2b57dc"} Mar 11 19:06:25 crc kubenswrapper[4842]: I0311 19:06:25.785250 4842 generic.go:334] "Generic (PLEG): 
container finished" podID="44630950-e31c-4734-9829-8ab16b54c9ee" containerID="e5599e9e18a3ef3b35860affa10bafa3681c0a8d6c8df33f2d0e08c7ad9cc809" exitCode=0 Mar 11 19:06:25 crc kubenswrapper[4842]: I0311 19:06:25.786187 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcnxm" event={"ID":"44630950-e31c-4734-9829-8ab16b54c9ee","Type":"ContainerDied","Data":"e5599e9e18a3ef3b35860affa10bafa3681c0a8d6c8df33f2d0e08c7ad9cc809"} Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.078723 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.193191 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdbnh\" (UniqueName: \"kubernetes.io/projected/e60d545e-d480-44f7-8c67-bba9975dd402-kube-api-access-rdbnh\") pod \"e60d545e-d480-44f7-8c67-bba9975dd402\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.193291 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-util\") pod \"e60d545e-d480-44f7-8c67-bba9975dd402\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.193340 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-bundle\") pod \"e60d545e-d480-44f7-8c67-bba9975dd402\" (UID: \"e60d545e-d480-44f7-8c67-bba9975dd402\") " Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.194075 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-bundle" (OuterVolumeSpecName: "bundle") pod 
"e60d545e-d480-44f7-8c67-bba9975dd402" (UID: "e60d545e-d480-44f7-8c67-bba9975dd402"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.198758 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60d545e-d480-44f7-8c67-bba9975dd402-kube-api-access-rdbnh" (OuterVolumeSpecName: "kube-api-access-rdbnh") pod "e60d545e-d480-44f7-8c67-bba9975dd402" (UID: "e60d545e-d480-44f7-8c67-bba9975dd402"). InnerVolumeSpecName "kube-api-access-rdbnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.223378 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-util" (OuterVolumeSpecName: "util") pod "e60d545e-d480-44f7-8c67-bba9975dd402" (UID: "e60d545e-d480-44f7-8c67-bba9975dd402"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.295990 4842 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.296083 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdbnh\" (UniqueName: \"kubernetes.io/projected/e60d545e-d480-44f7-8c67-bba9975dd402-kube-api-access-rdbnh\") on node \"crc\" DevicePath \"\"" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.296148 4842 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e60d545e-d480-44f7-8c67-bba9975dd402-util\") on node \"crc\" DevicePath \"\"" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.791893 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" event={"ID":"e60d545e-d480-44f7-8c67-bba9975dd402","Type":"ContainerDied","Data":"81f095ce0069615a53da5ce14da9e6f5c5fd75a1520860dc4b60c1723c429194"} Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.792192 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f095ce0069615a53da5ce14da9e6f5c5fd75a1520860dc4b60c1723c429194" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.791937 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6" Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.796257 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcnxm" event={"ID":"44630950-e31c-4734-9829-8ab16b54c9ee","Type":"ContainerStarted","Data":"3797ed8d342ae521eea264ab247642210d8230339f8ace3a9b49137f2a28fda0"} Mar 11 19:06:26 crc kubenswrapper[4842]: I0311 19:06:26.818885 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zcnxm" podStartSLOduration=2.197883974 podStartE2EDuration="4.818865019s" podCreationTimestamp="2026-03-11 19:06:22 +0000 UTC" firstStartedPulling="2026-03-11 19:06:23.769118465 +0000 UTC m=+1029.416814765" lastFinishedPulling="2026-03-11 19:06:26.39009953 +0000 UTC m=+1032.037795810" observedRunningTime="2026-03-11 19:06:26.815871111 +0000 UTC m=+1032.463567401" watchObservedRunningTime="2026-03-11 19:06:26.818865019 +0000 UTC m=+1032.466561299" Mar 11 19:06:32 crc kubenswrapper[4842]: I0311 19:06:32.819264 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zcnxm" Mar 11 19:06:32 crc kubenswrapper[4842]: I0311 19:06:32.819982 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-zcnxm"
Mar 11 19:06:32 crc kubenswrapper[4842]: I0311 19:06:32.871853 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zcnxm"
Mar 11 19:06:32 crc kubenswrapper[4842]: I0311 19:06:32.917839 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zcnxm"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.397982 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"]
Mar 11 19:06:34 crc kubenswrapper[4842]: E0311 19:06:34.398200 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60d545e-d480-44f7-8c67-bba9975dd402" containerName="util"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.398210 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60d545e-d480-44f7-8c67-bba9975dd402" containerName="util"
Mar 11 19:06:34 crc kubenswrapper[4842]: E0311 19:06:34.398235 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60d545e-d480-44f7-8c67-bba9975dd402" containerName="extract"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.398241 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60d545e-d480-44f7-8c67-bba9975dd402" containerName="extract"
Mar 11 19:06:34 crc kubenswrapper[4842]: E0311 19:06:34.398250 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60d545e-d480-44f7-8c67-bba9975dd402" containerName="pull"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.398256 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60d545e-d480-44f7-8c67-bba9975dd402" containerName="pull"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.398380 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e60d545e-d480-44f7-8c67-bba9975dd402" containerName="extract"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.398745 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.402090 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-n52rk"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.430886 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"]
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.520046 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4cbn\" (UniqueName: \"kubernetes.io/projected/3aa48e36-bcbd-4033-9cee-fc43aefb1b9a-kube-api-access-x4cbn\") pod \"openstack-operator-controller-init-58577bcd48-dkjvw\" (UID: \"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a\") " pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.620867 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4cbn\" (UniqueName: \"kubernetes.io/projected/3aa48e36-bcbd-4033-9cee-fc43aefb1b9a-kube-api-access-x4cbn\") pod \"openstack-operator-controller-init-58577bcd48-dkjvw\" (UID: \"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a\") " pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.650797 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4cbn\" (UniqueName: \"kubernetes.io/projected/3aa48e36-bcbd-4033-9cee-fc43aefb1b9a-kube-api-access-x4cbn\") pod \"openstack-operator-controller-init-58577bcd48-dkjvw\" (UID: \"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a\") " pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"
Mar 11 19:06:34 crc kubenswrapper[4842]: I0311 19:06:34.716781 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"
Mar 11 19:06:35 crc kubenswrapper[4842]: I0311 19:06:35.145680 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"]
Mar 11 19:06:35 crc kubenswrapper[4842]: I0311 19:06:35.871983 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" event={"ID":"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a","Type":"ContainerStarted","Data":"e3d2fe1edd16831310f02c7d66d3b382863188a18ed5220c89118d8f29aa6369"}
Mar 11 19:06:36 crc kubenswrapper[4842]: I0311 19:06:36.077573 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zcnxm"]
Mar 11 19:06:36 crc kubenswrapper[4842]: I0311 19:06:36.078912 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zcnxm" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="registry-server" containerID="cri-o://3797ed8d342ae521eea264ab247642210d8230339f8ace3a9b49137f2a28fda0" gracePeriod=2
Mar 11 19:06:36 crc kubenswrapper[4842]: W0311 19:06:36.638618 4842 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44630950_e31c_4734_9829_8ab16b54c9ee.slice/crio-05f9cce80b6dd528659f9ae8475c26e97b1887ba5af087e7d0688cadcb778708": error while statting cgroup v2: [unable to parse /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44630950_e31c_4734_9829_8ab16b54c9ee.slice/crio-05f9cce80b6dd528659f9ae8475c26e97b1887ba5af087e7d0688cadcb778708/memory.stat: read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44630950_e31c_4734_9829_8ab16b54c9ee.slice/crio-05f9cce80b6dd528659f9ae8475c26e97b1887ba5af087e7d0688cadcb778708/memory.stat: no such device], continuing to push stats
Mar 11 19:06:36 crc kubenswrapper[4842]: I0311 19:06:36.907331 4842 generic.go:334] "Generic (PLEG): container finished" podID="44630950-e31c-4734-9829-8ab16b54c9ee" containerID="3797ed8d342ae521eea264ab247642210d8230339f8ace3a9b49137f2a28fda0" exitCode=0
Mar 11 19:06:36 crc kubenswrapper[4842]: I0311 19:06:36.907649 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcnxm" event={"ID":"44630950-e31c-4734-9829-8ab16b54c9ee","Type":"ContainerDied","Data":"3797ed8d342ae521eea264ab247642210d8230339f8ace3a9b49137f2a28fda0"}
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.167511 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcnxm"
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.295108 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-catalog-content\") pod \"44630950-e31c-4734-9829-8ab16b54c9ee\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") "
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.295249 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-utilities\") pod \"44630950-e31c-4734-9829-8ab16b54c9ee\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") "
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.295318 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hwn6\" (UniqueName: \"kubernetes.io/projected/44630950-e31c-4734-9829-8ab16b54c9ee-kube-api-access-2hwn6\") pod \"44630950-e31c-4734-9829-8ab16b54c9ee\" (UID: \"44630950-e31c-4734-9829-8ab16b54c9ee\") "
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.296939 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-utilities" (OuterVolumeSpecName: "utilities") pod "44630950-e31c-4734-9829-8ab16b54c9ee" (UID: "44630950-e31c-4734-9829-8ab16b54c9ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.305412 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44630950-e31c-4734-9829-8ab16b54c9ee-kube-api-access-2hwn6" (OuterVolumeSpecName: "kube-api-access-2hwn6") pod "44630950-e31c-4734-9829-8ab16b54c9ee" (UID: "44630950-e31c-4734-9829-8ab16b54c9ee"). InnerVolumeSpecName "kube-api-access-2hwn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.348655 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44630950-e31c-4734-9829-8ab16b54c9ee" (UID: "44630950-e31c-4734-9829-8ab16b54c9ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.396287 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.396322 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44630950-e31c-4734-9829-8ab16b54c9ee-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.396336 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hwn6\" (UniqueName: \"kubernetes.io/projected/44630950-e31c-4734-9829-8ab16b54c9ee-kube-api-access-2hwn6\") on node \"crc\" DevicePath \"\""
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.926619 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" event={"ID":"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a","Type":"ContainerStarted","Data":"1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b"}
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.926746 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.928844 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcnxm" event={"ID":"44630950-e31c-4734-9829-8ab16b54c9ee","Type":"ContainerDied","Data":"05f9cce80b6dd528659f9ae8475c26e97b1887ba5af087e7d0688cadcb778708"}
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.928874 4842 scope.go:117] "RemoveContainer" containerID="3797ed8d342ae521eea264ab247642210d8230339f8ace3a9b49137f2a28fda0"
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.928905 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcnxm"
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.947489 4842 scope.go:117] "RemoveContainer" containerID="e5599e9e18a3ef3b35860affa10bafa3681c0a8d6c8df33f2d0e08c7ad9cc809"
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.965442 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" podStartSLOduration=1.71915137 podStartE2EDuration="5.965424519s" podCreationTimestamp="2026-03-11 19:06:34 +0000 UTC" firstStartedPulling="2026-03-11 19:06:35.155169465 +0000 UTC m=+1040.802865745" lastFinishedPulling="2026-03-11 19:06:39.401442574 +0000 UTC m=+1045.049138894" observedRunningTime="2026-03-11 19:06:39.95816559 +0000 UTC m=+1045.605861870" watchObservedRunningTime="2026-03-11 19:06:39.965424519 +0000 UTC m=+1045.613120809"
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.976006 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zcnxm"]
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.977716 4842 scope.go:117] "RemoveContainer" containerID="fa57691dfd6e11d838ea8bfe910434e97720a406ea89401050374eb69c8883e3"
Mar 11 19:06:39 crc kubenswrapper[4842]: I0311 19:06:39.980686 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zcnxm"]
Mar 11 19:06:40 crc kubenswrapper[4842]: I0311 19:06:40.970366 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" path="/var/lib/kubelet/pods/44630950-e31c-4734-9829-8ab16b54c9ee/volumes"
Mar 11 19:06:44 crc kubenswrapper[4842]: I0311 19:06:44.720858 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.728846 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"]
Mar 11 19:07:03 crc kubenswrapper[4842]: E0311 19:07:03.730501 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="registry-server"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.730525 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="registry-server"
Mar 11 19:07:03 crc kubenswrapper[4842]: E0311 19:07:03.730549 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="extract-content"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.730557 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="extract-content"
Mar 11 19:07:03 crc kubenswrapper[4842]: E0311 19:07:03.730574 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="extract-utilities"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.730587 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="extract-utilities"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.730764 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="44630950-e31c-4734-9829-8ab16b54c9ee" containerName="registry-server"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.731583 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.733401 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-f9jfd"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.733932 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.735228 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.742948 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.743217 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dgdsz"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.748606 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.780357 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.781211 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.783448 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4j9qb"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.810021 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.811458 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.814158 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7rljx"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.816559 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.839342 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.840162 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.847812 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-2xrbv"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.848876 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vbj\" (UniqueName: \"kubernetes.io/projected/cdb4f878-df19-48ad-bd71-88583edeb32a-kube-api-access-n7vbj\") pod \"cinder-operator-controller-manager-984cd4dcf-sxlvs\" (UID: \"cdb4f878-df19-48ad-bd71-88583edeb32a\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.848943 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hbjl\" (UniqueName: \"kubernetes.io/projected/a59c06c7-f7ea-4d35-9053-2d969ec7e7f9-kube-api-access-6hbjl\") pod \"barbican-operator-controller-manager-677bd678f7-n67cw\" (UID: \"a59c06c7-f7ea-4d35-9053-2d969ec7e7f9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.863882 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.871598 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.872977 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.879758 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7z5q2"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.883195 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.920298 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.921073 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.924329 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.928465 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.928852 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5d6vp"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.950547 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vbj\" (UniqueName: \"kubernetes.io/projected/cdb4f878-df19-48ad-bd71-88583edeb32a-kube-api-access-n7vbj\") pod \"cinder-operator-controller-manager-984cd4dcf-sxlvs\" (UID: \"cdb4f878-df19-48ad-bd71-88583edeb32a\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.950821 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhgb8\" (UniqueName: \"kubernetes.io/projected/de16110e-c77e-4513-b74b-86097ceb5a7d-kube-api-access-dhgb8\") pod \"glance-operator-controller-manager-5964f64c48-m8dg9\" (UID: \"de16110e-c77e-4513-b74b-86097ceb5a7d\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.950902 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hbjl\" (UniqueName: \"kubernetes.io/projected/a59c06c7-f7ea-4d35-9053-2d969ec7e7f9-kube-api-access-6hbjl\") pod \"barbican-operator-controller-manager-677bd678f7-n67cw\" (UID: \"a59c06c7-f7ea-4d35-9053-2d969ec7e7f9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.950990 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn2cd\" (UniqueName: \"kubernetes.io/projected/763d79b9-8982-4ef6-8bc7-c2378f8208f0-kube-api-access-vn2cd\") pod \"designate-operator-controller-manager-66d56f6ff4-w9hp6\" (UID: \"763d79b9-8982-4ef6-8bc7-c2378f8208f0\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.951084 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsrt\" (UniqueName: \"kubernetes.io/projected/6211c7b4-3c01-49bc-9f4e-59872605f5fe-kube-api-access-vqsrt\") pod \"heat-operator-controller-manager-77b6666d85-chrh5\" (UID: \"6211c7b4-3c01-49bc-9f4e-59872605f5fe\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.952050 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf"]
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.956528 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf"
Mar 11 19:07:03 crc kubenswrapper[4842]: I0311 19:07:03.959562 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-stzdk"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.017230 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.027685 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vbj\" (UniqueName: \"kubernetes.io/projected/cdb4f878-df19-48ad-bd71-88583edeb32a-kube-api-access-n7vbj\") pod \"cinder-operator-controller-manager-984cd4dcf-sxlvs\" (UID: \"cdb4f878-df19-48ad-bd71-88583edeb32a\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.032103 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hbjl\" (UniqueName: \"kubernetes.io/projected/a59c06c7-f7ea-4d35-9053-2d969ec7e7f9-kube-api-access-6hbjl\") pod \"barbican-operator-controller-manager-677bd678f7-n67cw\" (UID: \"a59c06c7-f7ea-4d35-9053-2d969ec7e7f9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.036253 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.053904 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.053985 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-822kl\" (UniqueName: \"kubernetes.io/projected/024796ba-bf60-48db-962e-5d8bf962c127-kube-api-access-822kl\") pod \"ironic-operator-controller-manager-6bbb499bbc-677rf\" (UID: \"024796ba-bf60-48db-962e-5d8bf962c127\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.054070 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqqxc\" (UniqueName: \"kubernetes.io/projected/d87344b8-890b-4457-8f09-ec98bea8300e-kube-api-access-mqqxc\") pod \"horizon-operator-controller-manager-6d9d6b584d-2vd9b\" (UID: \"d87344b8-890b-4457-8f09-ec98bea8300e\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.054095 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhgb8\" (UniqueName: \"kubernetes.io/projected/de16110e-c77e-4513-b74b-86097ceb5a7d-kube-api-access-dhgb8\") pod \"glance-operator-controller-manager-5964f64c48-m8dg9\" (UID: \"de16110e-c77e-4513-b74b-86097ceb5a7d\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.054117 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn2cd\" (UniqueName: \"kubernetes.io/projected/763d79b9-8982-4ef6-8bc7-c2378f8208f0-kube-api-access-vn2cd\") pod \"designate-operator-controller-manager-66d56f6ff4-w9hp6\" (UID: \"763d79b9-8982-4ef6-8bc7-c2378f8208f0\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.054166 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsrt\" (UniqueName: \"kubernetes.io/projected/6211c7b4-3c01-49bc-9f4e-59872605f5fe-kube-api-access-vqsrt\") pod \"heat-operator-controller-manager-77b6666d85-chrh5\" (UID: \"6211c7b4-3c01-49bc-9f4e-59872605f5fe\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.054229 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngmt\" (UniqueName: \"kubernetes.io/projected/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-kube-api-access-6ngmt\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.063640 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.072398 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.073234 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.078195 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.094679 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.105015 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7swgw"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.125663 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.126900 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.131483 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-cx7zr"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.133777 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn2cd\" (UniqueName: \"kubernetes.io/projected/763d79b9-8982-4ef6-8bc7-c2378f8208f0-kube-api-access-vn2cd\") pod \"designate-operator-controller-manager-66d56f6ff4-w9hp6\" (UID: \"763d79b9-8982-4ef6-8bc7-c2378f8208f0\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.148762 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.152319 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.153172 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.155513 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-822kl\" (UniqueName: \"kubernetes.io/projected/024796ba-bf60-48db-962e-5d8bf962c127-kube-api-access-822kl\") pod \"ironic-operator-controller-manager-6bbb499bbc-677rf\" (UID: \"024796ba-bf60-48db-962e-5d8bf962c127\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.155584 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqqxc\" (UniqueName: \"kubernetes.io/projected/d87344b8-890b-4457-8f09-ec98bea8300e-kube-api-access-mqqxc\") pod \"horizon-operator-controller-manager-6d9d6b584d-2vd9b\" (UID: \"d87344b8-890b-4457-8f09-ec98bea8300e\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.155649 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngmt\" (UniqueName: \"kubernetes.io/projected/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-kube-api-access-6ngmt\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.155674 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"
Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.155793 4842 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.155838 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert podName:80959ea3-dca7-4a95-b049-d8df7ebd0ce0 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:04.655821372 +0000 UTC m=+1070.303517652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert") pod "infra-operator-controller-manager-5995f4446f-dkj58" (UID: "80959ea3-dca7-4a95-b049-d8df7ebd0ce0") : secret "infra-operator-webhook-server-cert" not found
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.163189 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kfthh"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.164249 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqsrt\" (UniqueName: \"kubernetes.io/projected/6211c7b4-3c01-49bc-9f4e-59872605f5fe-kube-api-access-vqsrt\") pod \"heat-operator-controller-manager-77b6666d85-chrh5\" (UID: \"6211c7b4-3c01-49bc-9f4e-59872605f5fe\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.164624 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.180858 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhgb8\" (UniqueName: \"kubernetes.io/projected/de16110e-c77e-4513-b74b-86097ceb5a7d-kube-api-access-dhgb8\") pod \"glance-operator-controller-manager-5964f64c48-m8dg9\" (UID: \"de16110e-c77e-4513-b74b-86097ceb5a7d\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.184827 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngmt\" (UniqueName: \"kubernetes.io/projected/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-kube-api-access-6ngmt\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.185433 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.185950 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqqxc\" (UniqueName: \"kubernetes.io/projected/d87344b8-890b-4457-8f09-ec98bea8300e-kube-api-access-mqqxc\") pod \"horizon-operator-controller-manager-6d9d6b584d-2vd9b\" (UID: \"d87344b8-890b-4457-8f09-ec98bea8300e\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.191521 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.192555 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.194235 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-822kl\" (UniqueName: \"kubernetes.io/projected/024796ba-bf60-48db-962e-5d8bf962c127-kube-api-access-822kl\") pod \"ironic-operator-controller-manager-6bbb499bbc-677rf\" (UID: \"024796ba-bf60-48db-962e-5d8bf962c127\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.197657 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.198735 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lkjk9"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.205700 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.206609 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.209025 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-xbztj"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.213371 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.214325 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.216535 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-pncd7"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.223762 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.224457 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.234142 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.245377 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc"]
Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.246247 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.248622 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-q6775" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.248629 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.257385 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz222\" (UniqueName: \"kubernetes.io/projected/bffda318-ec25-4b92-992b-50cf5fb2f6a5-kube-api-access-dz222\") pod \"manila-operator-controller-manager-68f45f9d9f-22vbs\" (UID: \"bffda318-ec25-4b92-992b-50cf5fb2f6a5\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.257453 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5qng\" (UniqueName: \"kubernetes.io/projected/9c018477-14f2-4729-949a-25a46eae03ef-kube-api-access-b5qng\") pod \"keystone-operator-controller-manager-684f77d66d-btk6h\" (UID: \"9c018477-14f2-4729-949a-25a46eae03ef\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.257489 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d75lv\" (UniqueName: \"kubernetes.io/projected/935542fd-daef-458a-b3fe-e2d8291d6c44-kube-api-access-d75lv\") pod \"mariadb-operator-controller-manager-658d4cdd5-gbfbf\" (UID: \"935542fd-daef-458a-b3fe-e2d8291d6c44\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" Mar 11 19:07:04 crc 
kubenswrapper[4842]: I0311 19:07:04.260065 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.260868 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.266637 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lbw57" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.276855 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.281302 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.282220 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.286682 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-sz976" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.290313 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.325130 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.335996 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.356622 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.357512 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358315 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5qng\" (UniqueName: \"kubernetes.io/projected/9c018477-14f2-4729-949a-25a46eae03ef-kube-api-access-b5qng\") pod \"keystone-operator-controller-manager-684f77d66d-btk6h\" (UID: \"9c018477-14f2-4729-949a-25a46eae03ef\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358355 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d75lv\" (UniqueName: \"kubernetes.io/projected/935542fd-daef-458a-b3fe-e2d8291d6c44-kube-api-access-d75lv\") pod \"mariadb-operator-controller-manager-658d4cdd5-gbfbf\" (UID: \"935542fd-daef-458a-b3fe-e2d8291d6c44\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358407 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5xk\" (UniqueName: \"kubernetes.io/projected/68ca87ac-6c44-49f1-b128-9593caa6b74c-kube-api-access-9f5xk\") pod \"nova-operator-controller-manager-67b8c8c6bd-9kcrx\" (UID: 
\"68ca87ac-6c44-49f1-b128-9593caa6b74c\") " pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358426 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358450 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx9bw\" (UniqueName: \"kubernetes.io/projected/efd1a4f4-f73f-425c-87e9-a63681ca5466-kube-api-access-kx9bw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-f9xmb\" (UID: \"efd1a4f4-f73f-425c-87e9-a63681ca5466\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358473 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6n5\" (UniqueName: \"kubernetes.io/projected/ab4b8857-4909-4289-888e-711796d175d8-kube-api-access-gt6n5\") pod \"neutron-operator-controller-manager-776c5696bf-qd5nx\" (UID: \"ab4b8857-4909-4289-888e-711796d175d8\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358493 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz222\" (UniqueName: \"kubernetes.io/projected/bffda318-ec25-4b92-992b-50cf5fb2f6a5-kube-api-access-dz222\") pod \"manila-operator-controller-manager-68f45f9d9f-22vbs\" (UID: \"bffda318-ec25-4b92-992b-50cf5fb2f6a5\") " 
pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.358524 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtkvm\" (UniqueName: \"kubernetes.io/projected/463a4e68-9555-4065-aed2-91cdc5570602-kube-api-access-dtkvm\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.364764 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-tltmz" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.381568 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.382604 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.387718 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mx2p7" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.387742 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz222\" (UniqueName: \"kubernetes.io/projected/bffda318-ec25-4b92-992b-50cf5fb2f6a5-kube-api-access-dz222\") pod \"manila-operator-controller-manager-68f45f9d9f-22vbs\" (UID: \"bffda318-ec25-4b92-992b-50cf5fb2f6a5\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.387805 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5qng\" (UniqueName: \"kubernetes.io/projected/9c018477-14f2-4729-949a-25a46eae03ef-kube-api-access-b5qng\") pod \"keystone-operator-controller-manager-684f77d66d-btk6h\" (UID: \"9c018477-14f2-4729-949a-25a46eae03ef\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.389584 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.402429 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.408349 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d75lv\" (UniqueName: \"kubernetes.io/projected/935542fd-daef-458a-b3fe-e2d8291d6c44-kube-api-access-d75lv\") pod \"mariadb-operator-controller-manager-658d4cdd5-gbfbf\" (UID: \"935542fd-daef-458a-b3fe-e2d8291d6c44\") " 
pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.414037 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.416333 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.417235 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.427203 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.427422 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7b8k5" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.434501 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.468733 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx9bw\" (UniqueName: \"kubernetes.io/projected/efd1a4f4-f73f-425c-87e9-a63681ca5466-kube-api-access-kx9bw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-f9xmb\" (UID: \"efd1a4f4-f73f-425c-87e9-a63681ca5466\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.468813 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6n5\" (UniqueName: \"kubernetes.io/projected/ab4b8857-4909-4289-888e-711796d175d8-kube-api-access-gt6n5\") pod \"neutron-operator-controller-manager-776c5696bf-qd5nx\" (UID: \"ab4b8857-4909-4289-888e-711796d175d8\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.468923 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtkvm\" (UniqueName: \"kubernetes.io/projected/463a4e68-9555-4065-aed2-91cdc5570602-kube-api-access-dtkvm\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.468963 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cjs\" (UniqueName: \"kubernetes.io/projected/829f5c10-1f4f-4e84-a6ca-eba63ae106e2-kube-api-access-56cjs\") pod \"ovn-operator-controller-manager-bbc5b68f9-m59rl\" (UID: \"829f5c10-1f4f-4e84-a6ca-eba63ae106e2\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" Mar 11 19:07:04 crc 
kubenswrapper[4842]: I0311 19:07:04.469016 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv24h\" (UniqueName: \"kubernetes.io/projected/868a1fe1-c01f-4a07-b8d5-2d02985cc29d-kube-api-access-rv24h\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-czq5g\" (UID: \"868a1fe1-c01f-4a07-b8d5-2d02985cc29d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.469042 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjd95\" (UniqueName: \"kubernetes.io/projected/56f77fcb-3101-446e-b070-ff1dcda13209-kube-api-access-wjd95\") pod \"swift-operator-controller-manager-677c674df7-h5ksp\" (UID: \"56f77fcb-3101-446e-b070-ff1dcda13209\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.469126 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84hct\" (UniqueName: \"kubernetes.io/projected/54acfc0e-ae41-490e-ba38-f88a427ff791-kube-api-access-84hct\") pod \"placement-operator-controller-manager-574d45c66c-xdc5p\" (UID: \"54acfc0e-ae41-490e-ba38-f88a427ff791\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.469291 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfnr\" (UniqueName: \"kubernetes.io/projected/0b197ad5-cb5c-483b-85c9-16578c56dd04-kube-api-access-wlfnr\") pod \"test-operator-controller-manager-5c5cb9c4d7-25rbp\" (UID: \"0b197ad5-cb5c-483b-85c9-16578c56dd04\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.469330 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9f5xk\" (UniqueName: \"kubernetes.io/projected/68ca87ac-6c44-49f1-b128-9593caa6b74c-kube-api-access-9f5xk\") pod \"nova-operator-controller-manager-67b8c8c6bd-9kcrx\" (UID: \"68ca87ac-6c44-49f1-b128-9593caa6b74c\") " pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.469370 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.469692 4842 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.469790 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert podName:463a4e68-9555-4065-aed2-91cdc5570602 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:04.969767336 +0000 UTC m=+1070.617463616 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b785bdc" (UID: "463a4e68-9555-4065-aed2-91cdc5570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.481627 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.486213 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.503875 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5xk\" (UniqueName: \"kubernetes.io/projected/68ca87ac-6c44-49f1-b128-9593caa6b74c-kube-api-access-9f5xk\") pod \"nova-operator-controller-manager-67b8c8c6bd-9kcrx\" (UID: \"68ca87ac-6c44-49f1-b128-9593caa6b74c\") " pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.517329 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.529186 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-l82rh" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.532766 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtkvm\" (UniqueName: \"kubernetes.io/projected/463a4e68-9555-4065-aed2-91cdc5570602-kube-api-access-dtkvm\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.533469 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.544859 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6n5\" (UniqueName: \"kubernetes.io/projected/ab4b8857-4909-4289-888e-711796d175d8-kube-api-access-gt6n5\") pod \"neutron-operator-controller-manager-776c5696bf-qd5nx\" (UID: \"ab4b8857-4909-4289-888e-711796d175d8\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.546243 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx9bw\" (UniqueName: \"kubernetes.io/projected/efd1a4f4-f73f-425c-87e9-a63681ca5466-kube-api-access-kx9bw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-f9xmb\" (UID: \"efd1a4f4-f73f-425c-87e9-a63681ca5466\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.550970 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.560972 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.570421 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cjs\" (UniqueName: \"kubernetes.io/projected/829f5c10-1f4f-4e84-a6ca-eba63ae106e2-kube-api-access-56cjs\") pod \"ovn-operator-controller-manager-bbc5b68f9-m59rl\" (UID: \"829f5c10-1f4f-4e84-a6ca-eba63ae106e2\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.570526 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qjxx\" (UniqueName: \"kubernetes.io/projected/a3e2f9c3-1a9b-441e-87ac-07e25d805293-kube-api-access-5qjxx\") pod \"watcher-operator-controller-manager-6dd88c6f67-hlvdg\" (UID: \"a3e2f9c3-1a9b-441e-87ac-07e25d805293\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.570575 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv24h\" (UniqueName: \"kubernetes.io/projected/868a1fe1-c01f-4a07-b8d5-2d02985cc29d-kube-api-access-rv24h\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-czq5g\" (UID: \"868a1fe1-c01f-4a07-b8d5-2d02985cc29d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.570610 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjd95\" (UniqueName: \"kubernetes.io/projected/56f77fcb-3101-446e-b070-ff1dcda13209-kube-api-access-wjd95\") pod \"swift-operator-controller-manager-677c674df7-h5ksp\" (UID: \"56f77fcb-3101-446e-b070-ff1dcda13209\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" Mar 11 19:07:04 crc kubenswrapper[4842]: 
I0311 19:07:04.570682 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84hct\" (UniqueName: \"kubernetes.io/projected/54acfc0e-ae41-490e-ba38-f88a427ff791-kube-api-access-84hct\") pod \"placement-operator-controller-manager-574d45c66c-xdc5p\" (UID: \"54acfc0e-ae41-490e-ba38-f88a427ff791\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.570747 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfnr\" (UniqueName: \"kubernetes.io/projected/0b197ad5-cb5c-483b-85c9-16578c56dd04-kube-api-access-wlfnr\") pod \"test-operator-controller-manager-5c5cb9c4d7-25rbp\" (UID: \"0b197ad5-cb5c-483b-85c9-16578c56dd04\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.588036 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.589637 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.598119 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.598245 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.602859 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cjs\" (UniqueName: \"kubernetes.io/projected/829f5c10-1f4f-4e84-a6ca-eba63ae106e2-kube-api-access-56cjs\") pod \"ovn-operator-controller-manager-bbc5b68f9-m59rl\" (UID: \"829f5c10-1f4f-4e84-a6ca-eba63ae106e2\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.604090 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.607309 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-24bsj" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.607506 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.607716 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.613720 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjd95\" (UniqueName: \"kubernetes.io/projected/56f77fcb-3101-446e-b070-ff1dcda13209-kube-api-access-wjd95\") pod \"swift-operator-controller-manager-677c674df7-h5ksp\" (UID: \"56f77fcb-3101-446e-b070-ff1dcda13209\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.615078 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.616979 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.617839 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.619565 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfnr\" (UniqueName: \"kubernetes.io/projected/0b197ad5-cb5c-483b-85c9-16578c56dd04-kube-api-access-wlfnr\") pod \"test-operator-controller-manager-5c5cb9c4d7-25rbp\" (UID: \"0b197ad5-cb5c-483b-85c9-16578c56dd04\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.620113 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6x4w9" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.623102 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv24h\" (UniqueName: \"kubernetes.io/projected/868a1fe1-c01f-4a07-b8d5-2d02985cc29d-kube-api-access-rv24h\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-czq5g\" (UID: \"868a1fe1-c01f-4a07-b8d5-2d02985cc29d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.630959 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84hct\" (UniqueName: \"kubernetes.io/projected/54acfc0e-ae41-490e-ba38-f88a427ff791-kube-api-access-84hct\") pod \"placement-operator-controller-manager-574d45c66c-xdc5p\" (UID: 
\"54acfc0e-ae41-490e-ba38-f88a427ff791\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.634571 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.670834 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.671872 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.671938 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qjxx\" (UniqueName: \"kubernetes.io/projected/a3e2f9c3-1a9b-441e-87ac-07e25d805293-kube-api-access-5qjxx\") pod \"watcher-operator-controller-manager-6dd88c6f67-hlvdg\" (UID: \"a3e2f9c3-1a9b-441e-87ac-07e25d805293\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.672342 4842 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.672380 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert podName:80959ea3-dca7-4a95-b049-d8df7ebd0ce0 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:05.672367513 +0000 UTC m=+1071.320063793 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert") pod "infra-operator-controller-manager-5995f4446f-dkj58" (UID: "80959ea3-dca7-4a95-b049-d8df7ebd0ce0") : secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.679513 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.699109 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.702659 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qjxx\" (UniqueName: \"kubernetes.io/projected/a3e2f9c3-1a9b-441e-87ac-07e25d805293-kube-api-access-5qjxx\") pod \"watcher-operator-controller-manager-6dd88c6f67-hlvdg\" (UID: \"a3e2f9c3-1a9b-441e-87ac-07e25d805293\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.728005 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.766368 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.773760 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg8bc\" (UniqueName: \"kubernetes.io/projected/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-kube-api-access-fg8bc\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.773811 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.773873 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.773965 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq97t\" (UniqueName: \"kubernetes.io/projected/bfbdd09b-00b7-421c-911a-09e9720004f0-kube-api-access-mq97t\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7srdl\" (UID: \"bfbdd09b-00b7-421c-911a-09e9720004f0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" Mar 11 
19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.844332 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw"] Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.849340 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.875664 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq97t\" (UniqueName: \"kubernetes.io/projected/bfbdd09b-00b7-421c-911a-09e9720004f0-kube-api-access-mq97t\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7srdl\" (UID: \"bfbdd09b-00b7-421c-911a-09e9720004f0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.875732 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg8bc\" (UniqueName: \"kubernetes.io/projected/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-kube-api-access-fg8bc\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.875773 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.875808 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.875972 4842 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.876054 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:05.376029999 +0000 UTC m=+1071.023726279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "webhook-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.876048 4842 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.876162 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:05.376132032 +0000 UTC m=+1071.023828512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "metrics-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.900239 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg8bc\" (UniqueName: \"kubernetes.io/projected/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-kube-api-access-fg8bc\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.902153 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq97t\" (UniqueName: \"kubernetes.io/projected/bfbdd09b-00b7-421c-911a-09e9720004f0-kube-api-access-mq97t\") pod \"rabbitmq-cluster-operator-manager-668c99d594-7srdl\" (UID: \"bfbdd09b-00b7-421c-911a-09e9720004f0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.976790 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" Mar 11 19:07:04 crc kubenswrapper[4842]: I0311 19:07:04.980252 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.983207 4842 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:04 crc kubenswrapper[4842]: E0311 19:07:04.983360 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert podName:463a4e68-9555-4065-aed2-91cdc5570602 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:05.98333713 +0000 UTC m=+1071.631033410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b785bdc" (UID: "463a4e68-9555-4065-aed2-91cdc5570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.021684 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b"] Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.021717 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs"] Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.021728 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5"] Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.123783 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw" event={"ID":"a59c06c7-f7ea-4d35-9053-2d969ec7e7f9","Type":"ContainerStarted","Data":"887000c374a098f756b1623bcec20eb728e8f973d13e187112d834afca3e9bc3"} Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.201493 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf"] Mar 11 19:07:05 crc kubenswrapper[4842]: W0311 19:07:05.351408 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024796ba_bf60_48db_962e_5d8bf962c127.slice/crio-dbbc5baf5a1cae3df2e44acfcc98ad3ac7e76e70e8c2a054ef6cb56ddbf1a014 WatchSource:0}: Error finding container dbbc5baf5a1cae3df2e44acfcc98ad3ac7e76e70e8c2a054ef6cb56ddbf1a014: Status 404 returned error can't find the container with id dbbc5baf5a1cae3df2e44acfcc98ad3ac7e76e70e8c2a054ef6cb56ddbf1a014 Mar 11 
19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.397965 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.398013 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:05 crc kubenswrapper[4842]: E0311 19:07:05.398244 4842 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 19:07:05 crc kubenswrapper[4842]: E0311 19:07:05.399144 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:06.399115552 +0000 UTC m=+1072.046811832 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "metrics-server-cert" not found Mar 11 19:07:05 crc kubenswrapper[4842]: E0311 19:07:05.398287 4842 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 19:07:05 crc kubenswrapper[4842]: E0311 19:07:05.399330 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:06.399316317 +0000 UTC m=+1072.047012597 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "webhook-server-cert" not found Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.458611 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9"] Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.486942 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6"] Mar 11 19:07:05 crc kubenswrapper[4842]: W0311 19:07:05.489225 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde16110e_c77e_4513_b74b_86097ceb5a7d.slice/crio-9df74a86b81fdfa98c14c0b966f42f207576807654e3b02ca93ed39036f4424a WatchSource:0}: Error finding container 9df74a86b81fdfa98c14c0b966f42f207576807654e3b02ca93ed39036f4424a: Status 404 returned error can't find 
the container with id 9df74a86b81fdfa98c14c0b966f42f207576807654e3b02ca93ed39036f4424a Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.511597 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h"] Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.526535 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb"] Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.550243 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf"] Mar 11 19:07:05 crc kubenswrapper[4842]: W0311 19:07:05.608740 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935542fd_daef_458a_b3fe_e2d8291d6c44.slice/crio-6a5aa20ecd416fc4a24a8d22ab59ec96ac3f0b86079ee9ec11efdc3012e26b38 WatchSource:0}: Error finding container 6a5aa20ecd416fc4a24a8d22ab59ec96ac3f0b86079ee9ec11efdc3012e26b38: Status 404 returned error can't find the container with id 6a5aa20ecd416fc4a24a8d22ab59ec96ac3f0b86079ee9ec11efdc3012e26b38 Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.720677 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:05 crc kubenswrapper[4842]: E0311 19:07:05.720881 4842 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:05 crc kubenswrapper[4842]: E0311 19:07:05.720962 4842 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert podName:80959ea3-dca7-4a95-b049-d8df7ebd0ce0 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:07.72094292 +0000 UTC m=+1073.368639200 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert") pod "infra-operator-controller-manager-5995f4446f-dkj58" (UID: "80959ea3-dca7-4a95-b049-d8df7ebd0ce0") : secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.822965 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs"] Mar 11 19:07:05 crc kubenswrapper[4842]: I0311 19:07:05.974601 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx"] Mar 11 19:07:05 crc kubenswrapper[4842]: W0311 19:07:05.978211 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab4b8857_4909_4289_888e_711796d175d8.slice/crio-abab6d3ee2996fdefe9da591a30c48bbb69f0ac87f9f5921e7d0e4d107924282 WatchSource:0}: Error finding container abab6d3ee2996fdefe9da591a30c48bbb69f0ac87f9f5921e7d0e4d107924282: Status 404 returned error can't find the container with id abab6d3ee2996fdefe9da591a30c48bbb69f0ac87f9f5921e7d0e4d107924282 Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.027979 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.028175 4842 
secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.028234 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert podName:463a4e68-9555-4065-aed2-91cdc5570602 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:08.028217001 +0000 UTC m=+1073.675913281 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b785bdc" (UID: "463a4e68-9555-4065-aed2-91cdc5570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.055676 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g"] Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.073794 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp"] Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.106616 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx"] Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.113473 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg"] Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.154544 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" event={"ID":"efd1a4f4-f73f-425c-87e9-a63681ca5466","Type":"ContainerStarted","Data":"4550a3fc5a28df0cca801c8933f1ef6e7db1f862e49835f7c8c71a62b7f336d8"} Mar 11 
19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.164304 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" event={"ID":"868a1fe1-c01f-4a07-b8d5-2d02985cc29d","Type":"ContainerStarted","Data":"e6da59a065fd70b2bef9205a4f198f617e0813ceae3e96f8b281dde26d68e020"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.165829 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" event={"ID":"bffda318-ec25-4b92-992b-50cf5fb2f6a5","Type":"ContainerStarted","Data":"dba2a5f0a7f98c8d10ef663f1bb5e1d4a82f5cbfbaf8a613d8b4b602e6bd2c9b"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.169559 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b" event={"ID":"d87344b8-890b-4457-8f09-ec98bea8300e","Type":"ContainerStarted","Data":"af4d80361550e3cc50261804e2b6ca0532fd6b40b84b30a04751c6eded9a19fe"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.172408 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" event={"ID":"935542fd-daef-458a-b3fe-e2d8291d6c44","Type":"ContainerStarted","Data":"6a5aa20ecd416fc4a24a8d22ab59ec96ac3f0b86079ee9ec11efdc3012e26b38"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.174854 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl"] Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.177309 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" event={"ID":"024796ba-bf60-48db-962e-5d8bf962c127","Type":"ContainerStarted","Data":"dbbc5baf5a1cae3df2e44acfcc98ad3ac7e76e70e8c2a054ef6cb56ddbf1a014"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.178650 
4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9" event={"ID":"de16110e-c77e-4513-b74b-86097ceb5a7d","Type":"ContainerStarted","Data":"9df74a86b81fdfa98c14c0b966f42f207576807654e3b02ca93ed39036f4424a"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.179948 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" event={"ID":"56f77fcb-3101-446e-b070-ff1dcda13209","Type":"ContainerStarted","Data":"91fe3593679b82645b7c154f3aa314ef95b9ee79ba26aa30d78548f08477f3aa"} Mar 11 19:07:06 crc kubenswrapper[4842]: W0311 19:07:06.181503 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3e2f9c3_1a9b_441e_87ac_07e25d805293.slice/crio-6b9191e838482d5025d1b22a4e037e84b7af7b8de1aaff9d1015140ec93c159e WatchSource:0}: Error finding container 6b9191e838482d5025d1b22a4e037e84b7af7b8de1aaff9d1015140ec93c159e: Status 404 returned error can't find the container with id 6b9191e838482d5025d1b22a4e037e84b7af7b8de1aaff9d1015140ec93c159e Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.181798 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" event={"ID":"9c018477-14f2-4729-949a-25a46eae03ef","Type":"ContainerStarted","Data":"6ab825494fd194402665f853b7189a7f8b8a9a4cbdfde6bee3b408c4cb529e4a"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.183455 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" event={"ID":"ab4b8857-4909-4289-888e-711796d175d8","Type":"ContainerStarted","Data":"abab6d3ee2996fdefe9da591a30c48bbb69f0ac87f9f5921e7d0e4d107924282"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.186886 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5" event={"ID":"6211c7b4-3c01-49bc-9f4e-59872605f5fe","Type":"ContainerStarted","Data":"b83ea707311b45970ce44233aa369c340778323bb13f51d57f959b619dc1fcb0"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.188469 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs" event={"ID":"cdb4f878-df19-48ad-bd71-88583edeb32a","Type":"ContainerStarted","Data":"a23a2314ed44c600b79be4835cef0cc5c32a0f31b0ca56157e3e9bd05812f819"} Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.189710 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6" event={"ID":"763d79b9-8982-4ef6-8bc7-c2378f8208f0","Type":"ContainerStarted","Data":"227c7be7097c7b3c3773ac0a2fb3206b223d796c8e9d085d2cc4f4139814d16e"} Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.190809 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5qjxx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-hlvdg_openstack-operators(a3e2f9c3-1a9b-441e-87ac-07e25d805293): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.191058 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" event={"ID":"68ca87ac-6c44-49f1-b128-9593caa6b74c","Type":"ContainerStarted","Data":"aed4dd5203b88539dc24f182e767849968c0405846ba88aa02c279dbb82f8d63"} Mar 11 19:07:06 
crc kubenswrapper[4842]: E0311 19:07:06.192530 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" podUID="a3e2f9c3-1a9b-441e-87ac-07e25d805293" Mar 11 19:07:06 crc kubenswrapper[4842]: W0311 19:07:06.194783 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54acfc0e_ae41_490e_ba38_f88a427ff791.slice/crio-2415b1e87467a2589cf4c3f26078f5c02364d8706763a3cfe849774f7fa87d88 WatchSource:0}: Error finding container 2415b1e87467a2589cf4c3f26078f5c02364d8706763a3cfe849774f7fa87d88: Status 404 returned error can't find the container with id 2415b1e87467a2589cf4c3f26078f5c02364d8706763a3cfe849774f7fa87d88 Mar 11 19:07:06 crc kubenswrapper[4842]: W0311 19:07:06.203837 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod829f5c10_1f4f_4e84_a6ca_eba63ae106e2.slice/crio-c72109407691f5f41e733ec2407cc0d58eb008529bc508a6904b09422f031d9b WatchSource:0}: Error finding container c72109407691f5f41e733ec2407cc0d58eb008529bc508a6904b09422f031d9b: Status 404 returned error can't find the container with id c72109407691f5f41e733ec2407cc0d58eb008529bc508a6904b09422f031d9b Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.210331 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84hct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-xdc5p_openstack-operators(54acfc0e-ae41-490e-ba38-f88a427ff791): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.211614 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" podUID="54acfc0e-ae41-490e-ba38-f88a427ff791" Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.214243 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl"] Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.219834 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp"] Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.229617 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p"] Mar 11 19:07:06 crc kubenswrapper[4842]: W0311 19:07:06.231674 4842 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbdd09b_00b7_421c_911a_09e9720004f0.slice/crio-328013f9e8c710bf4f32e2e0d2804770720092ce9e61881b7c2240eb184b9222 WatchSource:0}: Error finding container 328013f9e8c710bf4f32e2e0d2804770720092ce9e61881b7c2240eb184b9222: Status 404 returned error can't find the container with id 328013f9e8c710bf4f32e2e0d2804770720092ce9e61881b7c2240eb184b9222 Mar 11 19:07:06 crc kubenswrapper[4842]: W0311 19:07:06.240671 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b197ad5_cb5c_483b_85c9_16578c56dd04.slice/crio-6fa778d5f99c85bba6e30b21a1014d898e71b89af3da05025d5731d8d940f52a WatchSource:0}: Error finding container 6fa778d5f99c85bba6e30b21a1014d898e71b89af3da05025d5731d8d940f52a: Status 404 returned error can't find the container with id 6fa778d5f99c85bba6e30b21a1014d898e71b89af3da05025d5731d8d940f52a Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.246529 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 
0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mq97t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-7srdl_openstack-operators(bfbdd09b-00b7-421c-911a-09e9720004f0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.247714 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" podUID="bfbdd09b-00b7-421c-911a-09e9720004f0" Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.257913 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wlfnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-25rbp_openstack-operators(0b197ad5-cb5c-483b-85c9-16578c56dd04): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.259006 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" podUID="0b197ad5-cb5c-483b-85c9-16578c56dd04" Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.436216 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:06 crc kubenswrapper[4842]: I0311 19:07:06.436287 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.436504 4842 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.436555 4842 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.436580 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:08.436563859 +0000 UTC m=+1074.084260139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "webhook-server-cert" not found Mar 11 19:07:06 crc kubenswrapper[4842]: E0311 19:07:06.436733 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:08.436701933 +0000 UTC m=+1074.084398213 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "metrics-server-cert" not found Mar 11 19:07:07 crc kubenswrapper[4842]: I0311 19:07:07.208203 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" event={"ID":"a3e2f9c3-1a9b-441e-87ac-07e25d805293","Type":"ContainerStarted","Data":"6b9191e838482d5025d1b22a4e037e84b7af7b8de1aaff9d1015140ec93c159e"} Mar 11 19:07:07 crc kubenswrapper[4842]: E0311 19:07:07.211233 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" podUID="a3e2f9c3-1a9b-441e-87ac-07e25d805293" Mar 11 19:07:07 crc kubenswrapper[4842]: I0311 19:07:07.213665 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" event={"ID":"829f5c10-1f4f-4e84-a6ca-eba63ae106e2","Type":"ContainerStarted","Data":"c72109407691f5f41e733ec2407cc0d58eb008529bc508a6904b09422f031d9b"} Mar 11 19:07:07 crc kubenswrapper[4842]: I0311 19:07:07.216045 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" event={"ID":"bfbdd09b-00b7-421c-911a-09e9720004f0","Type":"ContainerStarted","Data":"328013f9e8c710bf4f32e2e0d2804770720092ce9e61881b7c2240eb184b9222"} Mar 11 19:07:07 crc kubenswrapper[4842]: E0311 19:07:07.217902 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" podUID="bfbdd09b-00b7-421c-911a-09e9720004f0" Mar 11 19:07:07 crc kubenswrapper[4842]: I0311 19:07:07.220971 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" event={"ID":"0b197ad5-cb5c-483b-85c9-16578c56dd04","Type":"ContainerStarted","Data":"6fa778d5f99c85bba6e30b21a1014d898e71b89af3da05025d5731d8d940f52a"} Mar 11 19:07:07 crc kubenswrapper[4842]: E0311 19:07:07.224519 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" podUID="0b197ad5-cb5c-483b-85c9-16578c56dd04" Mar 11 19:07:07 crc kubenswrapper[4842]: I0311 19:07:07.229355 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" event={"ID":"54acfc0e-ae41-490e-ba38-f88a427ff791","Type":"ContainerStarted","Data":"2415b1e87467a2589cf4c3f26078f5c02364d8706763a3cfe849774f7fa87d88"} Mar 11 19:07:07 crc kubenswrapper[4842]: E0311 19:07:07.230758 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" podUID="54acfc0e-ae41-490e-ba38-f88a427ff791" Mar 11 19:07:07 crc kubenswrapper[4842]: I0311 
19:07:07.755308 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:07 crc kubenswrapper[4842]: E0311 19:07:07.755515 4842 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:07 crc kubenswrapper[4842]: E0311 19:07:07.755824 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert podName:80959ea3-dca7-4a95-b049-d8df7ebd0ce0 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:11.755803305 +0000 UTC m=+1077.403499585 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert") pod "infra-operator-controller-manager-5995f4446f-dkj58" (UID: "80959ea3-dca7-4a95-b049-d8df7ebd0ce0") : secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:08 crc kubenswrapper[4842]: I0311 19:07:08.059971 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.060126 4842 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.060193 4842 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert podName:463a4e68-9555-4065-aed2-91cdc5570602 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:12.06017443 +0000 UTC m=+1077.707870710 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b785bdc" (UID: "463a4e68-9555-4065-aed2-91cdc5570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.275811 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" podUID="0b197ad5-cb5c-483b-85c9-16578c56dd04" Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.276094 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" podUID="bfbdd09b-00b7-421c-911a-09e9720004f0" Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.276136 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" 
podUID="54acfc0e-ae41-490e-ba38-f88a427ff791" Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.277631 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" podUID="a3e2f9c3-1a9b-441e-87ac-07e25d805293" Mar 11 19:07:08 crc kubenswrapper[4842]: I0311 19:07:08.464760 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:08 crc kubenswrapper[4842]: I0311 19:07:08.464812 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.464966 4842 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.465012 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:12.464999776 +0000 UTC m=+1078.112696056 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "webhook-server-cert" not found Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.465048 4842 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 19:07:08 crc kubenswrapper[4842]: E0311 19:07:08.465067 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:12.465060318 +0000 UTC m=+1078.112756598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "metrics-server-cert" not found Mar 11 19:07:11 crc kubenswrapper[4842]: I0311 19:07:11.829107 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:11 crc kubenswrapper[4842]: E0311 19:07:11.829286 4842 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:11 crc kubenswrapper[4842]: E0311 19:07:11.829575 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert 
podName:80959ea3-dca7-4a95-b049-d8df7ebd0ce0 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:19.829556467 +0000 UTC m=+1085.477252747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert") pod "infra-operator-controller-manager-5995f4446f-dkj58" (UID: "80959ea3-dca7-4a95-b049-d8df7ebd0ce0") : secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:12 crc kubenswrapper[4842]: I0311 19:07:12.137324 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:12 crc kubenswrapper[4842]: E0311 19:07:12.137519 4842 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:12 crc kubenswrapper[4842]: E0311 19:07:12.137617 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert podName:463a4e68-9555-4065-aed2-91cdc5570602 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:20.137575867 +0000 UTC m=+1085.785272147 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b785bdc" (UID: "463a4e68-9555-4065-aed2-91cdc5570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:12 crc kubenswrapper[4842]: I0311 19:07:12.542359 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:12 crc kubenswrapper[4842]: I0311 19:07:12.542423 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:12 crc kubenswrapper[4842]: E0311 19:07:12.542611 4842 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 19:07:12 crc kubenswrapper[4842]: E0311 19:07:12.542668 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:20.54265007 +0000 UTC m=+1086.190346340 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "webhook-server-cert" not found Mar 11 19:07:12 crc kubenswrapper[4842]: E0311 19:07:12.542957 4842 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 19:07:12 crc kubenswrapper[4842]: E0311 19:07:12.543084 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:20.54302231 +0000 UTC m=+1086.190718680 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "metrics-server-cert" not found Mar 11 19:07:18 crc kubenswrapper[4842]: E0311 19:07:18.880903 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f" Mar 11 19:07:18 crc kubenswrapper[4842]: E0311 19:07:18.881550 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-56cjs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-m59rl_openstack-operators(829f5c10-1f4f-4e84-a6ca-eba63ae106e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:07:18 crc kubenswrapper[4842]: E0311 19:07:18.883573 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" podUID="829f5c10-1f4f-4e84-a6ca-eba63ae106e2" Mar 11 19:07:19 crc kubenswrapper[4842]: E0311 19:07:19.583871 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:571f369855b0891a2b14e54a4c1c5ae2fbbd5de4c8fddd48e81033aad4b26423" Mar 11 19:07:19 crc kubenswrapper[4842]: E0311 19:07:19.584053 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:571f369855b0891a2b14e54a4c1c5ae2fbbd5de4c8fddd48e81033aad4b26423,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6hbjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-677bd678f7-n67cw_openstack-operators(a59c06c7-f7ea-4d35-9053-2d969ec7e7f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:07:19 crc kubenswrapper[4842]: E0311 19:07:19.585242 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw" podUID="a59c06c7-f7ea-4d35-9053-2d969ec7e7f9" Mar 11 19:07:19 crc kubenswrapper[4842]: E0311 19:07:19.848935 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:571f369855b0891a2b14e54a4c1c5ae2fbbd5de4c8fddd48e81033aad4b26423\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw" podUID="a59c06c7-f7ea-4d35-9053-2d969ec7e7f9" Mar 11 19:07:19 crc kubenswrapper[4842]: E0311 19:07:19.849787 4842 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" podUID="829f5c10-1f4f-4e84-a6ca-eba63ae106e2" Mar 11 19:07:19 crc kubenswrapper[4842]: I0311 19:07:19.872530 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:19 crc kubenswrapper[4842]: E0311 19:07:19.872680 4842 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:19 crc kubenswrapper[4842]: E0311 19:07:19.872728 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert podName:80959ea3-dca7-4a95-b049-d8df7ebd0ce0 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:35.872712071 +0000 UTC m=+1101.520408351 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert") pod "infra-operator-controller-manager-5995f4446f-dkj58" (UID: "80959ea3-dca7-4a95-b049-d8df7ebd0ce0") : secret "infra-operator-webhook-server-cert" not found Mar 11 19:07:20 crc kubenswrapper[4842]: I0311 19:07:20.178222 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.178419 4842 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.178469 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert podName:463a4e68-9555-4065-aed2-91cdc5570602 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:36.178455202 +0000 UTC m=+1101.826151482 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b785bdc" (UID: "463a4e68-9555-4065-aed2-91cdc5570602") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.196446 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.196603 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-822kl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6bbb499bbc-677rf_openstack-operators(024796ba-bf60-48db-962e-5d8bf962c127): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.197759 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" podUID="024796ba-bf60-48db-962e-5d8bf962c127" Mar 11 19:07:20 crc kubenswrapper[4842]: I0311 19:07:20.585586 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod 
\"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:20 crc kubenswrapper[4842]: I0311 19:07:20.585656 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.585778 4842 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.585811 4842 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.585849 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. No retries permitted until 2026-03-11 19:07:36.585830705 +0000 UTC m=+1102.233526985 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "metrics-server-cert" not found Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.585875 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs podName:314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00 nodeName:}" failed. 
No retries permitted until 2026-03-11 19:07:36.585858346 +0000 UTC m=+1102.233554636 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs") pod "openstack-operator-controller-manager-7547d775f4-htzsf" (UID: "314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00") : secret "webhook-server-cert" not found Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.785888 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.786089 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dz222,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-68f45f9d9f-22vbs_openstack-operators(bffda318-ec25-4b92-992b-50cf5fb2f6a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.787299 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" podUID="bffda318-ec25-4b92-992b-50cf5fb2f6a5" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.853962 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" podUID="bffda318-ec25-4b92-992b-50cf5fb2f6a5" Mar 11 19:07:20 crc kubenswrapper[4842]: E0311 19:07:20.854159 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" podUID="024796ba-bf60-48db-962e-5d8bf962c127" Mar 11 19:07:22 crc kubenswrapper[4842]: E0311 19:07:22.487021 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 11 19:07:22 crc kubenswrapper[4842]: E0311 19:07:22.487225 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b5qng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-btk6h_openstack-operators(9c018477-14f2-4729-949a-25a46eae03ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:07:22 crc kubenswrapper[4842]: E0311 19:07:22.489179 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" podUID="9c018477-14f2-4729-949a-25a46eae03ef" Mar 11 19:07:22 crc kubenswrapper[4842]: E0311 19:07:22.869076 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" podUID="9c018477-14f2-4729-949a-25a46eae03ef" Mar 11 19:07:26 crc kubenswrapper[4842]: E0311 19:07:26.037535 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.111:5001/openstack-k8s-operators/nova-operator:ec4640dbf95862d5e3374245d0beb2e30bd7f950" Mar 11 19:07:26 crc kubenswrapper[4842]: E0311 19:07:26.037580 4842 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.111:5001/openstack-k8s-operators/nova-operator:ec4640dbf95862d5e3374245d0beb2e30bd7f950" Mar 11 19:07:26 crc kubenswrapper[4842]: E0311 19:07:26.037700 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.111:5001/openstack-k8s-operators/nova-operator:ec4640dbf95862d5e3374245d0beb2e30bd7f950,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9f5xk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-67b8c8c6bd-9kcrx_openstack-operators(68ca87ac-6c44-49f1-b128-9593caa6b74c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:07:26 crc kubenswrapper[4842]: E0311 19:07:26.039688 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = 
copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" podUID="68ca87ac-6c44-49f1-b128-9593caa6b74c" Mar 11 19:07:26 crc kubenswrapper[4842]: E0311 19:07:26.900755 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.111:5001/openstack-k8s-operators/nova-operator:ec4640dbf95862d5e3374245d0beb2e30bd7f950\\\"\"" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" podUID="68ca87ac-6c44-49f1-b128-9593caa6b74c" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.897055 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" event={"ID":"56f77fcb-3101-446e-b070-ff1dcda13209","Type":"ContainerStarted","Data":"26b310f9c4e3cdb4eda1a82f3738604bcf7383ed5630874440000a0c5fea9fc0"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.897409 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.898106 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" event={"ID":"0b197ad5-cb5c-483b-85c9-16578c56dd04","Type":"ContainerStarted","Data":"4fc761b53f2892c73b8ef0d9ac5f9305bb199d95120a9ae70c646b8b3fa02211"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.898462 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.899595 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" 
event={"ID":"54acfc0e-ae41-490e-ba38-f88a427ff791","Type":"ContainerStarted","Data":"e9428811b92d49cfc486eeb84c3425a0b2c3053a000fdc7dfb0fe4f006ad5116"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.899923 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.901048 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" event={"ID":"a3e2f9c3-1a9b-441e-87ac-07e25d805293","Type":"ContainerStarted","Data":"9baca4570f04c5bda14d9686739eb2828d157e968fabbd63f7451003356e04f3"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.901404 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.902457 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs" event={"ID":"cdb4f878-df19-48ad-bd71-88583edeb32a","Type":"ContainerStarted","Data":"716ee3d2279bce82dc086d997e82b82fdcb9f123197a22dc8ca81af05feddb11"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.902795 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.904100 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6" event={"ID":"763d79b9-8982-4ef6-8bc7-c2378f8208f0","Type":"ContainerStarted","Data":"55489e2a8501b0a8ed31decf69c60f81d58ee22cbdc50e5161b69f23af1ab86b"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.904255 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.905215 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" event={"ID":"bfbdd09b-00b7-421c-911a-09e9720004f0","Type":"ContainerStarted","Data":"bb2e8cf79a230ace8a804c856380488b23c0e9bb17524a1b5877f342197afe8b"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.906373 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" event={"ID":"ab4b8857-4909-4289-888e-711796d175d8","Type":"ContainerStarted","Data":"ae0939dac2c3832d5fd9ae3229b33642ebff967e54d8406c81d32c023d0883c4"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.906585 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.909491 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5" event={"ID":"6211c7b4-3c01-49bc-9f4e-59872605f5fe","Type":"ContainerStarted","Data":"20ca11c88eb451892f856843d9f9e3af49e20b8ff24d0683c58de3a700491089"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.910038 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.911192 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b" event={"ID":"d87344b8-890b-4457-8f09-ec98bea8300e","Type":"ContainerStarted","Data":"db7e36fd953f95e550d29b6b5c319643534724d35cdbcd06ed7410e86a01d853"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.911549 4842 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.912418 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" event={"ID":"935542fd-daef-458a-b3fe-e2d8291d6c44","Type":"ContainerStarted","Data":"2b0a1a6354b922a6fbd70a587baf111ebdf1dfe4bd322fc9c2bf83192e5275d9"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.912730 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.913597 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" event={"ID":"efd1a4f4-f73f-425c-87e9-a63681ca5466","Type":"ContainerStarted","Data":"7add0268fe13b6b380d9b789e15be54b4efba4a35052c5ebd9e23f102ed8d000"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.913930 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.914753 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9" event={"ID":"de16110e-c77e-4513-b74b-86097ceb5a7d","Type":"ContainerStarted","Data":"87a5bc34e6a4a70e5c4d49ceb344db4d71263559acc01ba08ffb3af9895e0b49"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.915055 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.916388 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" 
event={"ID":"868a1fe1-c01f-4a07-b8d5-2d02985cc29d","Type":"ContainerStarted","Data":"9e16c52b2942ed70f83195a1b28fe670e7eb18ef2ed304d85451e4bfcf1a6033"} Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.916481 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.935630 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" podStartSLOduration=5.997376704 podStartE2EDuration="23.935614738s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:06.110747277 +0000 UTC m=+1071.758443557" lastFinishedPulling="2026-03-11 19:07:24.048985311 +0000 UTC m=+1089.696681591" observedRunningTime="2026-03-11 19:07:27.931893911 +0000 UTC m=+1093.579590191" watchObservedRunningTime="2026-03-11 19:07:27.935614738 +0000 UTC m=+1093.583311008" Mar 11 19:07:27 crc kubenswrapper[4842]: I0311 19:07:27.994621 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" podStartSLOduration=3.588965236 podStartE2EDuration="23.994603602s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.546653228 +0000 UTC m=+1071.194349508" lastFinishedPulling="2026-03-11 19:07:25.952291594 +0000 UTC m=+1091.599987874" observedRunningTime="2026-03-11 19:07:27.993029451 +0000 UTC m=+1093.640725741" watchObservedRunningTime="2026-03-11 19:07:27.994603602 +0000 UTC m=+1093.642299882" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.023487 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5" podStartSLOduration=6.684316112 podStartE2EDuration="25.023471972s" podCreationTimestamp="2026-03-11 
19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.265972499 +0000 UTC m=+1070.913668779" lastFinishedPulling="2026-03-11 19:07:23.605128359 +0000 UTC m=+1089.252824639" observedRunningTime="2026-03-11 19:07:28.020565417 +0000 UTC m=+1093.668261697" watchObservedRunningTime="2026-03-11 19:07:28.023471972 +0000 UTC m=+1093.671168242" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.073149 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6" podStartSLOduration=6.532767492 podStartE2EDuration="25.073131774s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.508316681 +0000 UTC m=+1071.156012961" lastFinishedPulling="2026-03-11 19:07:24.048680963 +0000 UTC m=+1089.696377243" observedRunningTime="2026-03-11 19:07:28.07108206 +0000 UTC m=+1093.718778370" watchObservedRunningTime="2026-03-11 19:07:28.073131774 +0000 UTC m=+1093.720828064" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.096127 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b" podStartSLOduration=6.697184648 podStartE2EDuration="25.096113501s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.206135414 +0000 UTC m=+1070.853831684" lastFinishedPulling="2026-03-11 19:07:23.605064257 +0000 UTC m=+1089.252760537" observedRunningTime="2026-03-11 19:07:28.093928525 +0000 UTC m=+1093.741624805" watchObservedRunningTime="2026-03-11 19:07:28.096113501 +0000 UTC m=+1093.743809781" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.122667 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9" podStartSLOduration=6.57578363 podStartE2EDuration="25.122650951s" podCreationTimestamp="2026-03-11 19:07:03 +0000 
UTC" firstStartedPulling="2026-03-11 19:07:05.502199212 +0000 UTC m=+1071.149895492" lastFinishedPulling="2026-03-11 19:07:24.049066513 +0000 UTC m=+1089.696762813" observedRunningTime="2026-03-11 19:07:28.117953289 +0000 UTC m=+1093.765649569" watchObservedRunningTime="2026-03-11 19:07:28.122650951 +0000 UTC m=+1093.770347231" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.138079 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" podStartSLOduration=3.41873312 podStartE2EDuration="24.138063692s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:06.210181373 +0000 UTC m=+1071.857877653" lastFinishedPulling="2026-03-11 19:07:26.929511945 +0000 UTC m=+1092.577208225" observedRunningTime="2026-03-11 19:07:28.135374812 +0000 UTC m=+1093.783071092" watchObservedRunningTime="2026-03-11 19:07:28.138063692 +0000 UTC m=+1093.785759972" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.166317 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" podStartSLOduration=3.375924596 podStartE2EDuration="24.166299546s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:06.190674225 +0000 UTC m=+1071.838370505" lastFinishedPulling="2026-03-11 19:07:26.981049185 +0000 UTC m=+1092.628745455" observedRunningTime="2026-03-11 19:07:28.161553783 +0000 UTC m=+1093.809250053" watchObservedRunningTime="2026-03-11 19:07:28.166299546 +0000 UTC m=+1093.813995816" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.213858 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" podStartSLOduration=3.365495746 podStartE2EDuration="24.213841333s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" 
firstStartedPulling="2026-03-11 19:07:06.080367947 +0000 UTC m=+1071.728064227" lastFinishedPulling="2026-03-11 19:07:26.928713534 +0000 UTC m=+1092.576409814" observedRunningTime="2026-03-11 19:07:28.192326753 +0000 UTC m=+1093.840023033" watchObservedRunningTime="2026-03-11 19:07:28.213841333 +0000 UTC m=+1093.861537663" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.231943 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-7srdl" podStartSLOduration=3.495831495 podStartE2EDuration="24.231923463s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:06.246377354 +0000 UTC m=+1071.894073634" lastFinishedPulling="2026-03-11 19:07:26.982469322 +0000 UTC m=+1092.630165602" observedRunningTime="2026-03-11 19:07:28.230976588 +0000 UTC m=+1093.878672878" watchObservedRunningTime="2026-03-11 19:07:28.231923463 +0000 UTC m=+1093.879619743" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.263792 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" podStartSLOduration=5.229604725 podStartE2EDuration="25.263775981s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.981868056 +0000 UTC m=+1071.629564336" lastFinishedPulling="2026-03-11 19:07:26.016039312 +0000 UTC m=+1091.663735592" observedRunningTime="2026-03-11 19:07:28.26025162 +0000 UTC m=+1093.907947900" watchObservedRunningTime="2026-03-11 19:07:28.263775981 +0000 UTC m=+1093.911472261" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.277920 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" podStartSLOduration=3.606966354 podStartE2EDuration="24.277903699s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" 
firstStartedPulling="2026-03-11 19:07:06.257554804 +0000 UTC m=+1071.905251084" lastFinishedPulling="2026-03-11 19:07:26.928492149 +0000 UTC m=+1092.576188429" observedRunningTime="2026-03-11 19:07:28.273418022 +0000 UTC m=+1093.921114332" watchObservedRunningTime="2026-03-11 19:07:28.277903699 +0000 UTC m=+1093.925599979" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.295319 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" podStartSLOduration=4.965739003 podStartE2EDuration="25.295303511s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.622407768 +0000 UTC m=+1071.270104048" lastFinishedPulling="2026-03-11 19:07:25.951972276 +0000 UTC m=+1091.599668556" observedRunningTime="2026-03-11 19:07:28.290785454 +0000 UTC m=+1093.938481734" watchObservedRunningTime="2026-03-11 19:07:28.295303511 +0000 UTC m=+1093.942999791" Mar 11 19:07:28 crc kubenswrapper[4842]: I0311 19:07:28.336738 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs" podStartSLOduration=6.608661005 podStartE2EDuration="25.336722638s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.32061783 +0000 UTC m=+1070.968314110" lastFinishedPulling="2026-03-11 19:07:24.048679443 +0000 UTC m=+1089.696375743" observedRunningTime="2026-03-11 19:07:28.334294525 +0000 UTC m=+1093.981990805" watchObservedRunningTime="2026-03-11 19:07:28.336722638 +0000 UTC m=+1093.984418918" Mar 11 19:07:31 crc kubenswrapper[4842]: I0311 19:07:31.953667 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw" event={"ID":"a59c06c7-f7ea-4d35-9053-2d969ec7e7f9","Type":"ContainerStarted","Data":"74991ee33e09b2c6421e670d27a837e531f20c0f8908c3d9cfd3cc63675fade7"} Mar 11 
19:07:31 crc kubenswrapper[4842]: I0311 19:07:31.954346 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw" Mar 11 19:07:31 crc kubenswrapper[4842]: I0311 19:07:31.971657 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw" podStartSLOduration=2.402222543 podStartE2EDuration="28.97164117s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:04.944536221 +0000 UTC m=+1070.592232501" lastFinishedPulling="2026-03-11 19:07:31.513954848 +0000 UTC m=+1097.161651128" observedRunningTime="2026-03-11 19:07:31.969645528 +0000 UTC m=+1097.617341808" watchObservedRunningTime="2026-03-11 19:07:31.97164117 +0000 UTC m=+1097.619337450" Mar 11 19:07:33 crc kubenswrapper[4842]: I0311 19:07:33.969608 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" event={"ID":"024796ba-bf60-48db-962e-5d8bf962c127","Type":"ContainerStarted","Data":"892eacc9eddced7480d86d93f6f49caec72cbdc271389ea29469650bcd4c4eaf"} Mar 11 19:07:33 crc kubenswrapper[4842]: I0311 19:07:33.970071 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" Mar 11 19:07:33 crc kubenswrapper[4842]: I0311 19:07:33.990552 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" podStartSLOduration=2.750053217 podStartE2EDuration="30.990533808s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.373427614 +0000 UTC m=+1071.021123884" lastFinishedPulling="2026-03-11 19:07:33.613908185 +0000 UTC m=+1099.261604475" observedRunningTime="2026-03-11 19:07:33.984547523 +0000 UTC m=+1099.632243813" 
watchObservedRunningTime="2026-03-11 19:07:33.990533808 +0000 UTC m=+1099.638230098" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.066898 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-sxlvs" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.168211 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-chrh5" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.227113 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-2vd9b" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.416584 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-w9hp6" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.436924 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.564807 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-gbfbf" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.590376 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-qd5nx" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.607892 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.682773 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-xdc5p" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.706038 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-h5ksp" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.730931 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-czq5g" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.769543 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25rbp" Mar 11 19:07:34 crc kubenswrapper[4842]: I0311 19:07:34.851843 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-hlvdg" Mar 11 19:07:35 crc kubenswrapper[4842]: I0311 19:07:35.921531 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:35 crc kubenswrapper[4842]: I0311 19:07:35.927029 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80959ea3-dca7-4a95-b049-d8df7ebd0ce0-cert\") pod \"infra-operator-controller-manager-5995f4446f-dkj58\" (UID: \"80959ea3-dca7-4a95-b049-d8df7ebd0ce0\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.047439 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5d6vp" Mar 11 19:07:36 crc 
kubenswrapper[4842]: I0311 19:07:36.056289 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.225751 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.230980 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/463a4e68-9555-4065-aed2-91cdc5570602-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b785bdc\" (UID: \"463a4e68-9555-4065-aed2-91cdc5570602\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.450376 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-q6775" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.459350 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.483836 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58"] Mar 11 19:07:36 crc kubenswrapper[4842]: W0311 19:07:36.495922 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80959ea3_dca7_4a95_b049_d8df7ebd0ce0.slice/crio-be49219f02e7d6e4dc6d18faa493339513258383b5ae90a2856602ca88964ced WatchSource:0}: Error finding container be49219f02e7d6e4dc6d18faa493339513258383b5ae90a2856602ca88964ced: Status 404 returned error can't find the container with id be49219f02e7d6e4dc6d18faa493339513258383b5ae90a2856602ca88964ced Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.632418 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.632520 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.637302 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-metrics-certs\") pod 
\"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.637590 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00-webhook-certs\") pod \"openstack-operator-controller-manager-7547d775f4-htzsf\" (UID: \"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00\") " pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.679368 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc"] Mar 11 19:07:36 crc kubenswrapper[4842]: W0311 19:07:36.684512 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463a4e68_9555_4065_aed2_91cdc5570602.slice/crio-5070050dbf105f5170b3303036f0f9d9ce88e9e8541a58b6cbb8762e7014f8b5 WatchSource:0}: Error finding container 5070050dbf105f5170b3303036f0f9d9ce88e9e8541a58b6cbb8762e7014f8b5: Status 404 returned error can't find the container with id 5070050dbf105f5170b3303036f0f9d9ce88e9e8541a58b6cbb8762e7014f8b5 Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.719620 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-24bsj" Mar 11 19:07:36 crc kubenswrapper[4842]: I0311 19:07:36.728348 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:37 crc kubenswrapper[4842]: I0311 19:07:37.020020 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" event={"ID":"463a4e68-9555-4065-aed2-91cdc5570602","Type":"ContainerStarted","Data":"5070050dbf105f5170b3303036f0f9d9ce88e9e8541a58b6cbb8762e7014f8b5"} Mar 11 19:07:37 crc kubenswrapper[4842]: I0311 19:07:37.022687 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" event={"ID":"80959ea3-dca7-4a95-b049-d8df7ebd0ce0","Type":"ContainerStarted","Data":"be49219f02e7d6e4dc6d18faa493339513258383b5ae90a2856602ca88964ced"} Mar 11 19:07:37 crc kubenswrapper[4842]: I0311 19:07:37.030714 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf"] Mar 11 19:07:37 crc kubenswrapper[4842]: W0311 19:07:37.031792 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod314e6ccd_fc9e_4fbd_8a69_006e5c0e6c00.slice/crio-27a34ab0d1ce30c26f0e2bcc13dcbdaeb6de0678742a7daa626453ea9fa85ca7 WatchSource:0}: Error finding container 27a34ab0d1ce30c26f0e2bcc13dcbdaeb6de0678742a7daa626453ea9fa85ca7: Status 404 returned error can't find the container with id 27a34ab0d1ce30c26f0e2bcc13dcbdaeb6de0678742a7daa626453ea9fa85ca7 Mar 11 19:07:38 crc kubenswrapper[4842]: I0311 19:07:38.035173 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" event={"ID":"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00","Type":"ContainerStarted","Data":"27a34ab0d1ce30c26f0e2bcc13dcbdaeb6de0678742a7daa626453ea9fa85ca7"} Mar 11 19:07:44 crc kubenswrapper[4842]: I0311 19:07:44.081596 4842 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-n67cw" Mar 11 19:07:44 crc kubenswrapper[4842]: I0311 19:07:44.082175 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" event={"ID":"314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00","Type":"ContainerStarted","Data":"3dfa1e5579cdcb98d618c2f5e9f0817c5cec0f5e5a637c8fc1cf2eb4f07e7347"} Mar 11 19:07:44 crc kubenswrapper[4842]: I0311 19:07:44.327806 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-677rf" Mar 11 19:07:45 crc kubenswrapper[4842]: I0311 19:07:45.087722 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" Mar 11 19:07:45 crc kubenswrapper[4842]: I0311 19:07:45.110906 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf" podStartSLOduration=41.110890968 podStartE2EDuration="41.110890968s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:07:45.107212816 +0000 UTC m=+1110.754909096" watchObservedRunningTime="2026-03-11 19:07:45.110890968 +0000 UTC m=+1110.758587248" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.107799 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" event={"ID":"9c018477-14f2-4729-949a-25a46eae03ef","Type":"ContainerStarted","Data":"0b833b73241eee5ba3788cb1d6593bea1eeccf92142a5923345c33dc26b69a62"} Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.109265 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.113115 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" event={"ID":"80959ea3-dca7-4a95-b049-d8df7ebd0ce0","Type":"ContainerStarted","Data":"e62c2963aa3e9a40639a246abcbf45527ffc03aa5caa9e2eaf4a5633f977090b"} Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.113318 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.116188 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" event={"ID":"bffda318-ec25-4b92-992b-50cf5fb2f6a5","Type":"ContainerStarted","Data":"8f61b46222bab0522aeb71eed6d6a564af1d6e86813f203ae9cda7c337d0507a"} Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.117011 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.122115 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" event={"ID":"829f5c10-1f4f-4e84-a6ca-eba63ae106e2","Type":"ContainerStarted","Data":"facbd37ba00631815d70e266c524225329b3eae7c711ea7743a2ed6298be3263"} Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.122723 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.124454 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" 
event={"ID":"68ca87ac-6c44-49f1-b128-9593caa6b74c","Type":"ContainerStarted","Data":"f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8"} Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.124799 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.127322 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" event={"ID":"463a4e68-9555-4065-aed2-91cdc5570602","Type":"ContainerStarted","Data":"1c255e2ec2f162e47cb87689f50c91bc9235695d5d64e18e6bbd949ba5bd2a75"} Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.127394 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.168028 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" podStartSLOduration=3.551908418 podStartE2EDuration="45.16800706s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.845391807 +0000 UTC m=+1071.493088087" lastFinishedPulling="2026-03-11 19:07:47.461490449 +0000 UTC m=+1113.109186729" observedRunningTime="2026-03-11 19:07:48.166667797 +0000 UTC m=+1113.814364077" watchObservedRunningTime="2026-03-11 19:07:48.16800706 +0000 UTC m=+1113.815703350" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.176931 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" podStartSLOduration=3.256237122 podStartE2EDuration="45.176905934s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:05.540403006 +0000 UTC 
m=+1071.188099296" lastFinishedPulling="2026-03-11 19:07:47.461071818 +0000 UTC m=+1113.108768108" observedRunningTime="2026-03-11 19:07:48.137590137 +0000 UTC m=+1113.785286417" watchObservedRunningTime="2026-03-11 19:07:48.176905934 +0000 UTC m=+1113.824602214" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.196946 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" podStartSLOduration=2.847213023 podStartE2EDuration="44.196929747s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:06.124126585 +0000 UTC m=+1071.771822865" lastFinishedPulling="2026-03-11 19:07:47.473843309 +0000 UTC m=+1113.121539589" observedRunningTime="2026-03-11 19:07:48.193343967 +0000 UTC m=+1113.841040247" watchObservedRunningTime="2026-03-11 19:07:48.196929747 +0000 UTC m=+1113.844626027" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.212445 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" podStartSLOduration=2.945602118 podStartE2EDuration="44.212426776s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:06.208119519 +0000 UTC m=+1071.855815799" lastFinishedPulling="2026-03-11 19:07:47.474944177 +0000 UTC m=+1113.122640457" observedRunningTime="2026-03-11 19:07:48.209259246 +0000 UTC m=+1113.856955526" watchObservedRunningTime="2026-03-11 19:07:48.212426776 +0000 UTC m=+1113.860123056" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.231994 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" podStartSLOduration=34.259348263 podStartE2EDuration="45.231976547s" podCreationTimestamp="2026-03-11 19:07:03 +0000 UTC" firstStartedPulling="2026-03-11 19:07:36.499395749 +0000 UTC m=+1102.147092029" 
lastFinishedPulling="2026-03-11 19:07:47.472024013 +0000 UTC m=+1113.119720313" observedRunningTime="2026-03-11 19:07:48.231734321 +0000 UTC m=+1113.879430601" watchObservedRunningTime="2026-03-11 19:07:48.231976547 +0000 UTC m=+1113.879672817" Mar 11 19:07:48 crc kubenswrapper[4842]: I0311 19:07:48.257539 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc" podStartSLOduration=33.472350581 podStartE2EDuration="44.257522918s" podCreationTimestamp="2026-03-11 19:07:04 +0000 UTC" firstStartedPulling="2026-03-11 19:07:36.690059947 +0000 UTC m=+1102.337756227" lastFinishedPulling="2026-03-11 19:07:47.475232234 +0000 UTC m=+1113.122928564" observedRunningTime="2026-03-11 19:07:48.25677306 +0000 UTC m=+1113.904469340" watchObservedRunningTime="2026-03-11 19:07:48.257522918 +0000 UTC m=+1113.905219198" Mar 11 19:07:54 crc kubenswrapper[4842]: I0311 19:07:54.537999 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-btk6h" Mar 11 19:07:54 crc kubenswrapper[4842]: I0311 19:07:54.555092 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-22vbs" Mar 11 19:07:54 crc kubenswrapper[4842]: I0311 19:07:54.619828 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:07:54 crc kubenswrapper[4842]: I0311 19:07:54.675797 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-m59rl" Mar 11 19:07:56 crc kubenswrapper[4842]: I0311 19:07:56.067120 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-dkj58" Mar 11 
19:07:56 crc kubenswrapper[4842]: I0311 19:07:56.467862 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b785bdc"
Mar 11 19:07:56 crc kubenswrapper[4842]: I0311 19:07:56.735649 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7547d775f4-htzsf"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.139839 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554268-spxtq"]
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.141017 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554268-spxtq"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.148464 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.148745 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.148674 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.150891 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554268-spxtq"]
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.242483 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v45gp\" (UniqueName: \"kubernetes.io/projected/7882c743-aaef-4544-b4c5-65daeadb4fbe-kube-api-access-v45gp\") pod \"auto-csr-approver-29554268-spxtq\" (UID: \"7882c743-aaef-4544-b4c5-65daeadb4fbe\") " pod="openshift-infra/auto-csr-approver-29554268-spxtq"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.343957 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v45gp\" (UniqueName: \"kubernetes.io/projected/7882c743-aaef-4544-b4c5-65daeadb4fbe-kube-api-access-v45gp\") pod \"auto-csr-approver-29554268-spxtq\" (UID: \"7882c743-aaef-4544-b4c5-65daeadb4fbe\") " pod="openshift-infra/auto-csr-approver-29554268-spxtq"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.375217 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v45gp\" (UniqueName: \"kubernetes.io/projected/7882c743-aaef-4544-b4c5-65daeadb4fbe-kube-api-access-v45gp\") pod \"auto-csr-approver-29554268-spxtq\" (UID: \"7882c743-aaef-4544-b4c5-65daeadb4fbe\") " pod="openshift-infra/auto-csr-approver-29554268-spxtq"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.459892 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554268-spxtq"
Mar 11 19:08:00 crc kubenswrapper[4842]: I0311 19:08:00.988053 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554268-spxtq"]
Mar 11 19:08:01 crc kubenswrapper[4842]: I0311 19:08:01.226662 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554268-spxtq" event={"ID":"7882c743-aaef-4544-b4c5-65daeadb4fbe","Type":"ContainerStarted","Data":"684c78e88eeda5fea34563c044a6909164ddd4ec91a7ef1d7e16189dffe3f3ca"}
Mar 11 19:08:01 crc kubenswrapper[4842]: I0311 19:08:01.472464 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:08:01 crc kubenswrapper[4842]: I0311 19:08:01.472559 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:08:03 crc kubenswrapper[4842]: I0311 19:08:03.255303 4842 generic.go:334] "Generic (PLEG): container finished" podID="7882c743-aaef-4544-b4c5-65daeadb4fbe" containerID="9077efd6e8c02cda76448cdcf73db16153069e6c8b272a428484e199e5a5a97a" exitCode=0
Mar 11 19:08:03 crc kubenswrapper[4842]: I0311 19:08:03.255373 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554268-spxtq" event={"ID":"7882c743-aaef-4544-b4c5-65daeadb4fbe","Type":"ContainerDied","Data":"9077efd6e8c02cda76448cdcf73db16153069e6c8b272a428484e199e5a5a97a"}
Mar 11 19:08:04 crc kubenswrapper[4842]: I0311 19:08:04.599487 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554268-spxtq"
Mar 11 19:08:04 crc kubenswrapper[4842]: I0311 19:08:04.717826 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v45gp\" (UniqueName: \"kubernetes.io/projected/7882c743-aaef-4544-b4c5-65daeadb4fbe-kube-api-access-v45gp\") pod \"7882c743-aaef-4544-b4c5-65daeadb4fbe\" (UID: \"7882c743-aaef-4544-b4c5-65daeadb4fbe\") "
Mar 11 19:08:04 crc kubenswrapper[4842]: I0311 19:08:04.729627 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7882c743-aaef-4544-b4c5-65daeadb4fbe-kube-api-access-v45gp" (OuterVolumeSpecName: "kube-api-access-v45gp") pod "7882c743-aaef-4544-b4c5-65daeadb4fbe" (UID: "7882c743-aaef-4544-b4c5-65daeadb4fbe"). InnerVolumeSpecName "kube-api-access-v45gp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:08:04 crc kubenswrapper[4842]: I0311 19:08:04.819661 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v45gp\" (UniqueName: \"kubernetes.io/projected/7882c743-aaef-4544-b4c5-65daeadb4fbe-kube-api-access-v45gp\") on node \"crc\" DevicePath \"\""
Mar 11 19:08:05 crc kubenswrapper[4842]: I0311 19:08:05.279922 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554268-spxtq" event={"ID":"7882c743-aaef-4544-b4c5-65daeadb4fbe","Type":"ContainerDied","Data":"684c78e88eeda5fea34563c044a6909164ddd4ec91a7ef1d7e16189dffe3f3ca"}
Mar 11 19:08:05 crc kubenswrapper[4842]: I0311 19:08:05.279990 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="684c78e88eeda5fea34563c044a6909164ddd4ec91a7ef1d7e16189dffe3f3ca"
Mar 11 19:08:05 crc kubenswrapper[4842]: I0311 19:08:05.280127 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554268-spxtq"
Mar 11 19:08:05 crc kubenswrapper[4842]: I0311 19:08:05.691680 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554262-lvj4j"]
Mar 11 19:08:05 crc kubenswrapper[4842]: I0311 19:08:05.696374 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554262-lvj4j"]
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.749054 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/rabbitmq-cell1-server-0"]
Mar 11 19:08:06 crc kubenswrapper[4842]: E0311 19:08:06.749357 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7882c743-aaef-4544-b4c5-65daeadb4fbe" containerName="oc"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.749369 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7882c743-aaef-4544-b4c5-65daeadb4fbe" containerName="oc"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.749534 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7882c743-aaef-4544-b4c5-65daeadb4fbe" containerName="oc"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.750245 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.753303 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-cell1-erlang-cookie"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.753980 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"openshift-service-ca.crt"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.755334 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-cell1-plugins-conf"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.755573 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-cell1-server-conf"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.755741 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"kube-root-ca.crt"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.755981 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-cell1-default-user"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.757842 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-cell1-server-dockercfg-crn28"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.765863 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/rabbitmq-cell1-server-0"]
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850210 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e12d431f-86df-44d1-9877-3eb3c698d089-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850300 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e12d431f-86df-44d1-9877-3eb3c698d089-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850328 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850351 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850399 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850415 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kkc2\" (UniqueName: \"kubernetes.io/projected/e12d431f-86df-44d1-9877-3eb3c698d089-kube-api-access-4kkc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850434 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e12d431f-86df-44d1-9877-3eb3c698d089-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850463 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.850857 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e12d431f-86df-44d1-9877-3eb3c698d089-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.952337 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kkc2\" (UniqueName: \"kubernetes.io/projected/e12d431f-86df-44d1-9877-3eb3c698d089-kube-api-access-4kkc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.952800 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e12d431f-86df-44d1-9877-3eb3c698d089-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.952938 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.953091 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e12d431f-86df-44d1-9877-3eb3c698d089-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.953636 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.954092 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e12d431f-86df-44d1-9877-3eb3c698d089-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.954349 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e12d431f-86df-44d1-9877-3eb3c698d089-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.954521 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e12d431f-86df-44d1-9877-3eb3c698d089-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.954824 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e12d431f-86df-44d1-9877-3eb3c698d089-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.954922 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.955024 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.955695 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.955992 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.962014 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e12d431f-86df-44d1-9877-3eb3c698d089-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.962359 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e12d431f-86df-44d1-9877-3eb3c698d089-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.963973 4842 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.964022 4842 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f128f99d1ce9a11f7ae486905de7af47476c6bae3dfbc20bde17a9897aabfa13/globalmount\"" pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.964880 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e12d431f-86df-44d1-9877-3eb3c698d089-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.972792 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd5e323-3859-4747-94f7-20765fe176e2" path="/var/lib/kubelet/pods/0bd5e323-3859-4747-94f7-20765fe176e2/volumes"
Mar 11 19:08:06 crc kubenswrapper[4842]: I0311 19:08:06.987233 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kkc2\" (UniqueName: \"kubernetes.io/projected/e12d431f-86df-44d1-9877-3eb3c698d089-kube-api-access-4kkc2\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.005215 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd2e9d4-1339-4cdd-8910-f669648323ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"e12d431f-86df-44d1-9877-3eb3c698d089\") " pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.069535 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/rabbitmq-notifications-server-0"]
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.070830 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.073939 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-notifications-erlang-cookie"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.073961 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-notifications-server-conf"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.073939 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-notifications-default-user"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.074754 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-notifications-server-dockercfg-lkgfk"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.075425 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-notifications-plugins-conf"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.084011 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/rabbitmq-notifications-server-0"]
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.097334 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/rabbitmq-cell1-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.159219 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.159748 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8101bb7b-9fb5-418b-b490-e465171babc5-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.159836 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8101bb7b-9fb5-418b-b490-e465171babc5-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.159861 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.159908 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8101bb7b-9fb5-418b-b490-e465171babc5-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.159932 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8101bb7b-9fb5-418b-b490-e465171babc5-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.160297 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.160397 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.160448 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6gf\" (UniqueName: \"kubernetes.io/projected/8101bb7b-9fb5-418b-b490-e465171babc5-kube-api-access-9v6gf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266299 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8101bb7b-9fb5-418b-b490-e465171babc5-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266336 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8101bb7b-9fb5-418b-b490-e465171babc5-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266392 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266453 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266492 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6gf\" (UniqueName: \"kubernetes.io/projected/8101bb7b-9fb5-418b-b490-e465171babc5-kube-api-access-9v6gf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266537 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266566 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8101bb7b-9fb5-418b-b490-e465171babc5-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266606 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8101bb7b-9fb5-418b-b490-e465171babc5-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.266621 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.267285 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.268621 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.268973 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8101bb7b-9fb5-418b-b490-e465171babc5-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.269614 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8101bb7b-9fb5-418b-b490-e465171babc5-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.275372 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8101bb7b-9fb5-418b-b490-e465171babc5-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.275498 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8101bb7b-9fb5-418b-b490-e465171babc5-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.275524 4842 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.275573 4842 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/568888c0f6b29fda54dd37633f7e89d22757b2fa2967580e2382a6a38bc99883/globalmount\"" pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.287495 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6gf\" (UniqueName: \"kubernetes.io/projected/8101bb7b-9fb5-418b-b490-e465171babc5-kube-api-access-9v6gf\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.290934 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8101bb7b-9fb5-418b-b490-e465171babc5-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.309142 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7ce28481-8adc-4663-9d78-a75c8fcc8554\") pod \"rabbitmq-notifications-server-0\" (UID: \"8101bb7b-9fb5-418b-b490-e465171babc5\") " pod="nova-kuttl-default/rabbitmq-notifications-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.327991 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/rabbitmq-server-0"]
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.330373 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.334920 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-plugins-conf"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.335068 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-default-user"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.335124 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-server-dockercfg-pr7mz"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.335320 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-server-conf"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.335548 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-erlang-cookie"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.338204 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/rabbitmq-server-0"]
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368153 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368189 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368207 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13c13109-88f5-4c0d-9c15-739f9622af9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368240 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13c13109-88f5-4c0d-9c15-739f9622af9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368265 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13c13109-88f5-4c0d-9c15-739f9622af9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368328 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368394 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrb4\" (UniqueName: \"kubernetes.io/projected/13c13109-88f5-4c0d-9c15-739f9622af9d-kube-api-access-wmrb4\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368417 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13c13109-88f5-4c0d-9c15-739f9622af9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.368433 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0"
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.408733 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="nova-kuttl-default/rabbitmq-notifications-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.469187 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrb4\" (UniqueName: \"kubernetes.io/projected/13c13109-88f5-4c0d-9c15-739f9622af9d-kube-api-access-wmrb4\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.469714 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13c13109-88f5-4c0d-9c15-739f9622af9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.470545 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.470905 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.471621 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 
19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.471723 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13c13109-88f5-4c0d-9c15-739f9622af9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.471815 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13c13109-88f5-4c0d-9c15-739f9622af9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.471894 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13c13109-88f5-4c0d-9c15-739f9622af9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.471990 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.472059 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.470855 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.470515 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13c13109-88f5-4c0d-9c15-739f9622af9d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.473154 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13c13109-88f5-4c0d-9c15-739f9622af9d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.474873 4842 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.474910 4842 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6bfc8d083f654dc7e038915f6231f3d7281c0289d6630d569a14018c5675f1db/globalmount\"" pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.477931 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13c13109-88f5-4c0d-9c15-739f9622af9d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.478448 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13c13109-88f5-4c0d-9c15-739f9622af9d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.484197 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13c13109-88f5-4c0d-9c15-739f9622af9d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.493157 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrb4\" (UniqueName: \"kubernetes.io/projected/13c13109-88f5-4c0d-9c15-739f9622af9d-kube-api-access-wmrb4\") pod \"rabbitmq-server-0\" (UID: 
\"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.526731 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8bc05b9-46f0-4ddb-b7e4-604264c42d84\") pod \"rabbitmq-server-0\" (UID: \"13c13109-88f5-4c0d-9c15-739f9622af9d\") " pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.539641 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/openstack-galera-0"] Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.578744 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.587811 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"galera-openstack-dockercfg-whn4b" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.588129 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"cert-galera-openstack-svc" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.589656 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"openstack-config-data" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.603872 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"openstack-scripts" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.604392 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"combined-ca-bundle" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.624321 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/openstack-galera-0"] Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.646703 4842 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["nova-kuttl-default/rabbitmq-cell1-server-0"] Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.656574 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/rabbitmq-broadcaster-server-0"] Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.657888 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.661105 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-broadcaster-server-conf" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.661157 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-broadcaster-server-dockercfg-mcpz9" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.661301 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-broadcaster-erlang-cookie" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.661416 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"rabbitmq-broadcaster-plugins-conf" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.661480 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"rabbitmq-broadcaster-default-user" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.665547 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/rabbitmq-broadcaster-server-0"] Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.668971 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779350 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b22b349-fc5f-4da6-818f-412f7dde5f00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779398 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-plugins\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779474 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/baa6ffd5-2b78-4119-b6f1-a70465d5288d-erlang-cookie-secret\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779577 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b22b349-fc5f-4da6-818f-412f7dde5f00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779596 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779620 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzg9n\" (UniqueName: \"kubernetes.io/projected/2b22b349-fc5f-4da6-818f-412f7dde5f00-kube-api-access-qzg9n\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779685 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/baa6ffd5-2b78-4119-b6f1-a70465d5288d-pod-info\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779710 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779733 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-confd\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779907 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrsr7\" (UniqueName: 
\"kubernetes.io/projected/baa6ffd5-2b78-4119-b6f1-a70465d5288d-kube-api-access-mrsr7\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.779984 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/baa6ffd5-2b78-4119-b6f1-a70465d5288d-server-conf\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.780012 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-erlang-cookie\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.780061 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-70702348-7c90-4b87-ad14-96131b13c8d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-70702348-7c90-4b87-ad14-96131b13c8d8\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.780217 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b22b349-fc5f-4da6-818f-412f7dde5f00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 
19:08:07.780253 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.780351 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5ba03d80-6514-485d-a984-27e02586330b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba03d80-6514-485d-a984-27e02586330b\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.780374 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/baa6ffd5-2b78-4119-b6f1-a70465d5288d-plugins-conf\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881402 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b22b349-fc5f-4da6-818f-412f7dde5f00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881446 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-plugins\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 
19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881471 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/baa6ffd5-2b78-4119-b6f1-a70465d5288d-erlang-cookie-secret\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881517 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881534 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b22b349-fc5f-4da6-818f-412f7dde5f00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881554 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzg9n\" (UniqueName: \"kubernetes.io/projected/2b22b349-fc5f-4da6-818f-412f7dde5f00-kube-api-access-qzg9n\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881574 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 
19:08:07.881588 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-confd\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881603 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/baa6ffd5-2b78-4119-b6f1-a70465d5288d-pod-info\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881642 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrsr7\" (UniqueName: \"kubernetes.io/projected/baa6ffd5-2b78-4119-b6f1-a70465d5288d-kube-api-access-mrsr7\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881670 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/baa6ffd5-2b78-4119-b6f1-a70465d5288d-server-conf\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881687 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-erlang-cookie\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc 
kubenswrapper[4842]: I0311 19:08:07.881712 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-70702348-7c90-4b87-ad14-96131b13c8d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-70702348-7c90-4b87-ad14-96131b13c8d8\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881733 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b22b349-fc5f-4da6-818f-412f7dde5f00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881748 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881769 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5ba03d80-6514-485d-a984-27e02586330b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba03d80-6514-485d-a984-27e02586330b\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.881786 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/baa6ffd5-2b78-4119-b6f1-a70465d5288d-plugins-conf\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " 
pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.882237 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-plugins\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.883381 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2b22b349-fc5f-4da6-818f-412f7dde5f00-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.883997 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-kolla-config\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.884018 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/baa6ffd5-2b78-4119-b6f1-a70465d5288d-server-conf\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.884328 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-erlang-cookie\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 
11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.885689 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.886443 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/baa6ffd5-2b78-4119-b6f1-a70465d5288d-erlang-cookie-secret\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.889930 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2b22b349-fc5f-4da6-818f-412f7dde5f00-config-data-default\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.894683 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/baa6ffd5-2b78-4119-b6f1-a70465d5288d-plugins-conf\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.895051 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b22b349-fc5f-4da6-818f-412f7dde5f00-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.896321 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/baa6ffd5-2b78-4119-b6f1-a70465d5288d-pod-info\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.896910 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b22b349-fc5f-4da6-818f-412f7dde5f00-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.897351 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/baa6ffd5-2b78-4119-b6f1-a70465d5288d-rabbitmq-confd\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.898836 4842 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.898879 4842 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5ba03d80-6514-485d-a984-27e02586330b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba03d80-6514-485d-a984-27e02586330b\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ddb3e0105dcad9a42ee4951870121840e72088580d53b6f19480d09494100f8/globalmount\"" pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.898903 4842 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.898938 4842 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-70702348-7c90-4b87-ad14-96131b13c8d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-70702348-7c90-4b87-ad14-96131b13c8d8\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5bb5dc20cc6561c05fae50d4bd94bff5ba700e130ef6b68c8683ddb1f664b515/globalmount\"" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.904938 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzg9n\" (UniqueName: \"kubernetes.io/projected/2b22b349-fc5f-4da6-818f-412f7dde5f00-kube-api-access-qzg9n\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.906492 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/rabbitmq-notifications-server-0"] Mar 11 19:08:07 crc 
kubenswrapper[4842]: I0311 19:08:07.906960 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrsr7\" (UniqueName: \"kubernetes.io/projected/baa6ffd5-2b78-4119-b6f1-a70465d5288d-kube-api-access-mrsr7\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.934303 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/rabbitmq-server-0"] Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.964105 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-70702348-7c90-4b87-ad14-96131b13c8d8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-70702348-7c90-4b87-ad14-96131b13c8d8\") pod \"rabbitmq-broadcaster-server-0\" (UID: \"baa6ffd5-2b78-4119-b6f1-a70465d5288d\") " pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.974068 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5ba03d80-6514-485d-a984-27e02586330b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba03d80-6514-485d-a984-27e02586330b\") pod \"openstack-galera-0\" (UID: \"2b22b349-fc5f-4da6-818f-412f7dde5f00\") " pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:07 crc kubenswrapper[4842]: I0311 19:08:07.991630 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.098279 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/memcached-0"] Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.099238 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.101558 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"memcached-config-data" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.102935 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"memcached-memcached-dockercfg-vms29" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.116054 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/memcached-0"] Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.226448 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.295416 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28scp\" (UniqueName: \"kubernetes.io/projected/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-kube-api-access-28scp\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.296080 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-kolla-config\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.296139 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-config-data\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.328663 4842 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-cell1-server-0" event={"ID":"e12d431f-86df-44d1-9877-3eb3c698d089","Type":"ContainerStarted","Data":"aac2587cb4c163be2e5b54d142bd669dd805cfc372b38b30eaca3c39c611a2d1"} Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.343502 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-server-0" event={"ID":"13c13109-88f5-4c0d-9c15-739f9622af9d","Type":"ContainerStarted","Data":"add711da06d2b727084acf1075b909e6306ecbac5714f2acc90cd8b328384bdb"} Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.375075 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"8101bb7b-9fb5-418b-b490-e465171babc5","Type":"ContainerStarted","Data":"0bd608b4939c34e7373c18d32ab21d9ab256df4a4618abaf30ed84e2f795a693"} Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.397538 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-config-data\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.397657 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28scp\" (UniqueName: \"kubernetes.io/projected/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-kube-api-access-28scp\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.397717 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-kolla-config\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 
19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.399348 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-kolla-config\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.399983 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-config-data\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.442725 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28scp\" (UniqueName: \"kubernetes.io/projected/6f57e7eb-fa53-4182-9531-a3ebcd1df17c-kube-api-access-28scp\") pod \"memcached-0\" (UID: \"6f57e7eb-fa53-4182-9531-a3ebcd1df17c\") " pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.547246 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/rabbitmq-broadcaster-server-0"] Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.719857 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/memcached-0" Mar 11 19:08:08 crc kubenswrapper[4842]: I0311 19:08:08.839583 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/openstack-galera-0"] Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.198844 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/openstack-cell1-galera-0"] Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.200319 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.202732 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"galera-openstack-cell1-dockercfg-xmqbk" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.202852 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"openstack-cell1-scripts" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.216963 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"cert-galera-openstack-cell1-svc" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.222813 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"openstack-cell1-config-data" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.246596 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/openstack-cell1-galera-0"] Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.281147 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/memcached-0"] Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.348430 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.348822 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clc6d\" (UniqueName: \"kubernetes.io/projected/0e137603-1bc4-4ccf-ba33-09993a8e6e79-kube-api-access-clc6d\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " 
pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.348856 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e137603-1bc4-4ccf-ba33-09993a8e6e79-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.349514 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.349617 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e137603-1bc4-4ccf-ba33-09993a8e6e79-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.349686 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.349732 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e137603-1bc4-4ccf-ba33-09993a8e6e79-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.349831 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.384839 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-galera-0" event={"ID":"2b22b349-fc5f-4da6-818f-412f7dde5f00","Type":"ContainerStarted","Data":"f67e572685e5759c48ad5337daeb517f7f0ef48824c80c2fb17d57f28a0186c4"} Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.386246 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" event={"ID":"baa6ffd5-2b78-4119-b6f1-a70465d5288d","Type":"ContainerStarted","Data":"ddce6db8bd90a05f6392a2c60c0eac52f2dd1ed071d77406b9f801579417b4f2"} Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.387297 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/memcached-0" event={"ID":"6f57e7eb-fa53-4182-9531-a3ebcd1df17c","Type":"ContainerStarted","Data":"4b4ca0b741bacb8d72c59fdafece66ccaac35fd5ece36d9045fd80df6bc146f2"} Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.452339 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.452415 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clc6d\" (UniqueName: \"kubernetes.io/projected/0e137603-1bc4-4ccf-ba33-09993a8e6e79-kube-api-access-clc6d\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.452481 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e137603-1bc4-4ccf-ba33-09993a8e6e79-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.452508 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.453337 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e137603-1bc4-4ccf-ba33-09993a8e6e79-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.453381 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.453410 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e137603-1bc4-4ccf-ba33-09993a8e6e79-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.453445 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.454338 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e137603-1bc4-4ccf-ba33-09993a8e6e79-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.454774 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.454849 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.456030 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e137603-1bc4-4ccf-ba33-09993a8e6e79-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.458403 4842 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.458440 4842 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d2491db7cee5ce0acd146f65723b1a07652c5b351181929c4ab61b98eb634599/globalmount\"" pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.459125 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e137603-1bc4-4ccf-ba33-09993a8e6e79-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.473580 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clc6d\" (UniqueName: \"kubernetes.io/projected/0e137603-1bc4-4ccf-ba33-09993a8e6e79-kube-api-access-clc6d\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.485558 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e137603-1bc4-4ccf-ba33-09993a8e6e79-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.498436 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c4d7257-a3b9-4609-a3a2-f97f92259b25\") pod \"openstack-cell1-galera-0\" (UID: \"0e137603-1bc4-4ccf-ba33-09993a8e6e79\") " pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:09 crc kubenswrapper[4842]: I0311 19:08:09.550460 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:10 crc kubenswrapper[4842]: I0311 19:08:10.105653 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/openstack-cell1-galera-0"] Mar 11 19:08:10 crc kubenswrapper[4842]: W0311 19:08:10.127183 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e137603_1bc4_4ccf_ba33_09993a8e6e79.slice/crio-4040265a4d08e3a73a3b4e4f6af7f0cd68c48ed431a5ab6c354072af9ef9745d WatchSource:0}: Error finding container 4040265a4d08e3a73a3b4e4f6af7f0cd68c48ed431a5ab6c354072af9ef9745d: Status 404 returned error can't find the container with id 4040265a4d08e3a73a3b4e4f6af7f0cd68c48ed431a5ab6c354072af9ef9745d Mar 11 19:08:10 crc kubenswrapper[4842]: I0311 19:08:10.401324 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-cell1-galera-0" event={"ID":"0e137603-1bc4-4ccf-ba33-09993a8e6e79","Type":"ContainerStarted","Data":"4040265a4d08e3a73a3b4e4f6af7f0cd68c48ed431a5ab6c354072af9ef9745d"} Mar 11 19:08:20 crc kubenswrapper[4842]: E0311 19:08:20.477414 4842 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 11 19:08:20 crc kubenswrapper[4842]: E0311 19:08:20.478267 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kkc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_nova-kuttl-default(e12d431f-86df-44d1-9877-3eb3c698d089): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:08:20 crc 
kubenswrapper[4842]: E0311 19:08:20.479576 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="nova-kuttl-default/rabbitmq-cell1-server-0" podUID="e12d431f-86df-44d1-9877-3eb3c698d089" Mar 11 19:08:21 crc kubenswrapper[4842]: E0311 19:08:21.347377 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 11 19:08:21 crc kubenswrapper[4842]: E0311 19:08:21.348147 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9v6gf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_nova-kuttl-default(8101bb7b-9fb5-418b-b490-e465171babc5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 
19:08:21 crc kubenswrapper[4842]: E0311 19:08:21.349427 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="nova-kuttl-default/rabbitmq-notifications-server-0" podUID="8101bb7b-9fb5-418b-b490-e465171babc5" Mar 11 19:08:21 crc kubenswrapper[4842]: E0311 19:08:21.380403 4842 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 11 19:08:21 crc kubenswrapper[4842]: E0311 19:08:21.380820 4842 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmrb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_nova-kuttl-default(13c13109-88f5-4c0d-9c15-739f9622af9d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 19:08:21 crc 
kubenswrapper[4842]: E0311 19:08:21.382227 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="nova-kuttl-default/rabbitmq-server-0" podUID="13c13109-88f5-4c0d-9c15-739f9622af9d" Mar 11 19:08:22 crc kubenswrapper[4842]: I0311 19:08:22.494994 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-galera-0" event={"ID":"2b22b349-fc5f-4da6-818f-412f7dde5f00","Type":"ContainerStarted","Data":"56b0d7100cd3c3cee9b11920d70915364548684b8b231a2692c5f44526e6c0a9"} Mar 11 19:08:22 crc kubenswrapper[4842]: I0311 19:08:22.497435 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/memcached-0" event={"ID":"6f57e7eb-fa53-4182-9531-a3ebcd1df17c","Type":"ContainerStarted","Data":"9e3ff2e1481af7162acad10b5187ae8ec469ac72bc0e4cd044a0a1046d7359d7"} Mar 11 19:08:22 crc kubenswrapper[4842]: I0311 19:08:22.497592 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/memcached-0" Mar 11 19:08:22 crc kubenswrapper[4842]: I0311 19:08:22.502980 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-cell1-galera-0" event={"ID":"0e137603-1bc4-4ccf-ba33-09993a8e6e79","Type":"ContainerStarted","Data":"de94f40dc13714ae75056fffeb58f3703a46d8aa8b3aceb34bbd2446a8e85e70"} Mar 11 19:08:22 crc kubenswrapper[4842]: I0311 19:08:22.557030 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/memcached-0" podStartSLOduration=2.4350725029999998 podStartE2EDuration="14.557002007s" podCreationTimestamp="2026-03-11 19:08:08 +0000 UTC" firstStartedPulling="2026-03-11 19:08:09.306142842 +0000 UTC m=+1134.953839122" lastFinishedPulling="2026-03-11 19:08:21.428072336 +0000 UTC m=+1147.075768626" observedRunningTime="2026-03-11 19:08:22.544022221 +0000 UTC m=+1148.191718521" 
watchObservedRunningTime="2026-03-11 19:08:22.557002007 +0000 UTC m=+1148.204698297" Mar 11 19:08:25 crc kubenswrapper[4842]: I0311 19:08:25.541598 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"8101bb7b-9fb5-418b-b490-e465171babc5","Type":"ContainerStarted","Data":"398b2c970a637deb7770c6f05b8a29a06f286e0efe98c9a092b36de1e3e05336"} Mar 11 19:08:25 crc kubenswrapper[4842]: I0311 19:08:25.545141 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" event={"ID":"baa6ffd5-2b78-4119-b6f1-a70465d5288d","Type":"ContainerStarted","Data":"5ade1946c57393611158e883a697542362da716ea0ad82d60e110cc7b4325868"} Mar 11 19:08:25 crc kubenswrapper[4842]: I0311 19:08:25.547055 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-cell1-server-0" event={"ID":"e12d431f-86df-44d1-9877-3eb3c698d089","Type":"ContainerStarted","Data":"f879c8b32f50a512684539f89273e3af49b2a0fb5fe3eb06288271a08f4d9fab"} Mar 11 19:08:25 crc kubenswrapper[4842]: I0311 19:08:25.553846 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-server-0" event={"ID":"13c13109-88f5-4c0d-9c15-739f9622af9d","Type":"ContainerStarted","Data":"b62080bc5105fad42c55f0ac02bd5c1f661941f2668ca4b4c42ce9dfbdb38f9e"} Mar 11 19:08:26 crc kubenswrapper[4842]: I0311 19:08:26.910725 4842 scope.go:117] "RemoveContainer" containerID="9d36a6b1bbfc694d6c7ebf85a1e13db7373d20cabe44b67d73e4b99b23c02bb9" Mar 11 19:08:28 crc kubenswrapper[4842]: I0311 19:08:28.583349 4842 generic.go:334] "Generic (PLEG): container finished" podID="2b22b349-fc5f-4da6-818f-412f7dde5f00" containerID="56b0d7100cd3c3cee9b11920d70915364548684b8b231a2692c5f44526e6c0a9" exitCode=0 Mar 11 19:08:28 crc kubenswrapper[4842]: I0311 19:08:28.583460 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-galera-0" 
event={"ID":"2b22b349-fc5f-4da6-818f-412f7dde5f00","Type":"ContainerDied","Data":"56b0d7100cd3c3cee9b11920d70915364548684b8b231a2692c5f44526e6c0a9"} Mar 11 19:08:28 crc kubenswrapper[4842]: I0311 19:08:28.589068 4842 generic.go:334] "Generic (PLEG): container finished" podID="0e137603-1bc4-4ccf-ba33-09993a8e6e79" containerID="de94f40dc13714ae75056fffeb58f3703a46d8aa8b3aceb34bbd2446a8e85e70" exitCode=0 Mar 11 19:08:28 crc kubenswrapper[4842]: I0311 19:08:28.589114 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-cell1-galera-0" event={"ID":"0e137603-1bc4-4ccf-ba33-09993a8e6e79","Type":"ContainerDied","Data":"de94f40dc13714ae75056fffeb58f3703a46d8aa8b3aceb34bbd2446a8e85e70"} Mar 11 19:08:28 crc kubenswrapper[4842]: I0311 19:08:28.727576 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/memcached-0" Mar 11 19:08:29 crc kubenswrapper[4842]: I0311 19:08:29.603215 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-cell1-galera-0" event={"ID":"0e137603-1bc4-4ccf-ba33-09993a8e6e79","Type":"ContainerStarted","Data":"5d1d0d40089c870562a75e61bdeac1284cf6a74a6558ccbdf36ec2334cdef2ea"} Mar 11 19:08:29 crc kubenswrapper[4842]: I0311 19:08:29.606970 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstack-galera-0" event={"ID":"2b22b349-fc5f-4da6-818f-412f7dde5f00","Type":"ContainerStarted","Data":"c7da8db01cd2a69250182e79ae88d242917c0fcb88ee5eb30da80aec5103fefe"} Mar 11 19:08:29 crc kubenswrapper[4842]: I0311 19:08:29.634253 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/openstack-cell1-galera-0" podStartSLOduration=10.362540993 podStartE2EDuration="21.634233435s" podCreationTimestamp="2026-03-11 19:08:08 +0000 UTC" firstStartedPulling="2026-03-11 19:08:10.132691219 +0000 UTC m=+1135.780387499" lastFinishedPulling="2026-03-11 19:08:21.404383641 +0000 UTC m=+1147.052079941" 
observedRunningTime="2026-03-11 19:08:29.625564187 +0000 UTC m=+1155.273260537" watchObservedRunningTime="2026-03-11 19:08:29.634233435 +0000 UTC m=+1155.281929715" Mar 11 19:08:29 crc kubenswrapper[4842]: I0311 19:08:29.663137 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/openstack-galera-0" podStartSLOduration=11.114336328 podStartE2EDuration="23.6631112s" podCreationTimestamp="2026-03-11 19:08:06 +0000 UTC" firstStartedPulling="2026-03-11 19:08:08.855960708 +0000 UTC m=+1134.503656988" lastFinishedPulling="2026-03-11 19:08:21.40473555 +0000 UTC m=+1147.052431860" observedRunningTime="2026-03-11 19:08:29.654481173 +0000 UTC m=+1155.302177493" watchObservedRunningTime="2026-03-11 19:08:29.6631112 +0000 UTC m=+1155.310807520" Mar 11 19:08:31 crc kubenswrapper[4842]: I0311 19:08:31.472126 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:08:31 crc kubenswrapper[4842]: I0311 19:08:31.472208 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:08:38 crc kubenswrapper[4842]: I0311 19:08:38.227455 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:38 crc kubenswrapper[4842]: I0311 19:08:38.228126 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:38 crc kubenswrapper[4842]: I0311 19:08:38.376931 4842 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:38 crc kubenswrapper[4842]: I0311 19:08:38.811209 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/openstack-galera-0" Mar 11 19:08:39 crc kubenswrapper[4842]: I0311 19:08:39.552418 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:39 crc kubenswrapper[4842]: I0311 19:08:39.553517 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:39 crc kubenswrapper[4842]: I0311 19:08:39.671172 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:39 crc kubenswrapper[4842]: I0311 19:08:39.814332 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/openstack-cell1-galera-0" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.664947 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/root-account-create-update-s864f"] Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.668220 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.670869 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"openstack-mariadb-root-db-secret" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.687673 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/root-account-create-update-s864f"] Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.758236 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t65p\" (UniqueName: \"kubernetes.io/projected/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-kube-api-access-4t65p\") pod \"root-account-create-update-s864f\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.758351 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-operator-scripts\") pod \"root-account-create-update-s864f\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.859811 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t65p\" (UniqueName: \"kubernetes.io/projected/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-kube-api-access-4t65p\") pod \"root-account-create-update-s864f\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.859868 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-operator-scripts\") pod 
\"root-account-create-update-s864f\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.860691 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-operator-scripts\") pod \"root-account-create-update-s864f\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:46 crc kubenswrapper[4842]: I0311 19:08:46.879871 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t65p\" (UniqueName: \"kubernetes.io/projected/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-kube-api-access-4t65p\") pod \"root-account-create-update-s864f\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:47 crc kubenswrapper[4842]: I0311 19:08:47.008589 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:47 crc kubenswrapper[4842]: I0311 19:08:47.502255 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/root-account-create-update-s864f"] Mar 11 19:08:47 crc kubenswrapper[4842]: W0311 19:08:47.520129 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3f652d9_5b7b_4b4c_ae09_317de7bff3c5.slice/crio-eb1080d7195e07ee174979c7d9306bf56cc598cdef1f4507aa7ea94326bc3743 WatchSource:0}: Error finding container eb1080d7195e07ee174979c7d9306bf56cc598cdef1f4507aa7ea94326bc3743: Status 404 returned error can't find the container with id eb1080d7195e07ee174979c7d9306bf56cc598cdef1f4507aa7ea94326bc3743 Mar 11 19:08:47 crc kubenswrapper[4842]: I0311 19:08:47.794027 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/root-account-create-update-s864f" event={"ID":"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5","Type":"ContainerStarted","Data":"9ad2378cb4700afaef5d3e57ccbe6c1db255c389eb8009a2b027691e761c4e36"} Mar 11 19:08:47 crc kubenswrapper[4842]: I0311 19:08:47.794527 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/root-account-create-update-s864f" event={"ID":"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5","Type":"ContainerStarted","Data":"eb1080d7195e07ee174979c7d9306bf56cc598cdef1f4507aa7ea94326bc3743"} Mar 11 19:08:47 crc kubenswrapper[4842]: I0311 19:08:47.814847 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/root-account-create-update-s864f" podStartSLOduration=1.8148255949999998 podStartE2EDuration="1.814825595s" podCreationTimestamp="2026-03-11 19:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:08:47.810193369 +0000 UTC m=+1173.457889709" watchObservedRunningTime="2026-03-11 
19:08:47.814825595 +0000 UTC m=+1173.462521885" Mar 11 19:08:47 crc kubenswrapper[4842]: I0311 19:08:47.966995 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/keystone-db-create-xwxph"] Mar 11 19:08:47 crc kubenswrapper[4842]: I0311 19:08:47.968637 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.000016 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-db-create-xwxph"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.074480 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/keystone-06c9-account-create-update-ln4k5"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.075659 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.081030 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-db-secret" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.083172 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-operator-scripts\") pod \"keystone-db-create-xwxph\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.083580 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2p9s\" (UniqueName: \"kubernetes.io/projected/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-kube-api-access-p2p9s\") pod \"keystone-db-create-xwxph\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc 
kubenswrapper[4842]: I0311 19:08:48.089482 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-06c9-account-create-update-ln4k5"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.186433 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2vx\" (UniqueName: \"kubernetes.io/projected/ddcf6712-e4ab-4aa4-848e-46de1967ef16-kube-api-access-zd2vx\") pod \"keystone-06c9-account-create-update-ln4k5\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.186929 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddcf6712-e4ab-4aa4-848e-46de1967ef16-operator-scripts\") pod \"keystone-06c9-account-create-update-ln4k5\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.187150 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-operator-scripts\") pod \"keystone-db-create-xwxph\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.187341 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2p9s\" (UniqueName: \"kubernetes.io/projected/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-kube-api-access-p2p9s\") pod \"keystone-db-create-xwxph\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.188425 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-operator-scripts\") pod \"keystone-db-create-xwxph\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.213086 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2p9s\" (UniqueName: \"kubernetes.io/projected/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-kube-api-access-p2p9s\") pod \"keystone-db-create-xwxph\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.289718 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2vx\" (UniqueName: \"kubernetes.io/projected/ddcf6712-e4ab-4aa4-848e-46de1967ef16-kube-api-access-zd2vx\") pod \"keystone-06c9-account-create-update-ln4k5\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.289904 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddcf6712-e4ab-4aa4-848e-46de1967ef16-operator-scripts\") pod \"keystone-06c9-account-create-update-ln4k5\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.290963 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddcf6712-e4ab-4aa4-848e-46de1967ef16-operator-scripts\") pod \"keystone-06c9-account-create-update-ln4k5\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 
19:08:48.298759 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.321591 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2vx\" (UniqueName: \"kubernetes.io/projected/ddcf6712-e4ab-4aa4-848e-46de1967ef16-kube-api-access-zd2vx\") pod \"keystone-06c9-account-create-update-ln4k5\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.386252 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/placement-db-create-mwt27"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.387437 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.410603 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.425402 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-db-create-mwt27"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.478696 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/placement-2bc1-account-create-update-zklwm"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.479611 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.481894 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"placement-db-secret" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.484142 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-2bc1-account-create-update-zklwm"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.492402 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f35e50-4944-4740-a5a2-f35bfc66b4d7-operator-scripts\") pod \"placement-db-create-mwt27\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.492481 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clg8t\" (UniqueName: \"kubernetes.io/projected/51f35e50-4944-4740-a5a2-f35bfc66b4d7-kube-api-access-clg8t\") pod \"placement-db-create-mwt27\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.593795 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txsjg\" (UniqueName: \"kubernetes.io/projected/86fa2647-8583-4275-a4ed-f664fb1b1c20-kube-api-access-txsjg\") pod \"placement-2bc1-account-create-update-zklwm\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.593843 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/51f35e50-4944-4740-a5a2-f35bfc66b4d7-operator-scripts\") pod \"placement-db-create-mwt27\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.593921 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clg8t\" (UniqueName: \"kubernetes.io/projected/51f35e50-4944-4740-a5a2-f35bfc66b4d7-kube-api-access-clg8t\") pod \"placement-db-create-mwt27\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.593951 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86fa2647-8583-4275-a4ed-f664fb1b1c20-operator-scripts\") pod \"placement-2bc1-account-create-update-zklwm\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.595744 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f35e50-4944-4740-a5a2-f35bfc66b4d7-operator-scripts\") pod \"placement-db-create-mwt27\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.613217 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clg8t\" (UniqueName: \"kubernetes.io/projected/51f35e50-4944-4740-a5a2-f35bfc66b4d7-kube-api-access-clg8t\") pod \"placement-db-create-mwt27\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.695426 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86fa2647-8583-4275-a4ed-f664fb1b1c20-operator-scripts\") pod \"placement-2bc1-account-create-update-zklwm\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.695500 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txsjg\" (UniqueName: \"kubernetes.io/projected/86fa2647-8583-4275-a4ed-f664fb1b1c20-kube-api-access-txsjg\") pod \"placement-2bc1-account-create-update-zklwm\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.696161 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86fa2647-8583-4275-a4ed-f664fb1b1c20-operator-scripts\") pod \"placement-2bc1-account-create-update-zklwm\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.711565 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txsjg\" (UniqueName: \"kubernetes.io/projected/86fa2647-8583-4275-a4ed-f664fb1b1c20-kube-api-access-txsjg\") pod \"placement-2bc1-account-create-update-zklwm\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.782998 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.794797 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-db-create-xwxph"] Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.796533 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:48 crc kubenswrapper[4842]: W0311 19:08:48.803246 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b7de43_4cd4_4c79_bc25_f88450b0b0fa.slice/crio-0d700b34dd559c90a94eba6c3f9ce2a1f1921b6b7be6f676ef7e56a5a2b88668 WatchSource:0}: Error finding container 0d700b34dd559c90a94eba6c3f9ce2a1f1921b6b7be6f676ef7e56a5a2b88668: Status 404 returned error can't find the container with id 0d700b34dd559c90a94eba6c3f9ce2a1f1921b6b7be6f676ef7e56a5a2b88668 Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.803780 4842 generic.go:334] "Generic (PLEG): container finished" podID="f3f652d9-5b7b-4b4c-ae09-317de7bff3c5" containerID="9ad2378cb4700afaef5d3e57ccbe6c1db255c389eb8009a2b027691e761c4e36" exitCode=0 Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.803813 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/root-account-create-update-s864f" event={"ID":"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5","Type":"ContainerDied","Data":"9ad2378cb4700afaef5d3e57ccbe6c1db255c389eb8009a2b027691e761c4e36"} Mar 11 19:08:48 crc kubenswrapper[4842]: I0311 19:08:48.876581 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-06c9-account-create-update-ln4k5"] Mar 11 19:08:48 crc kubenswrapper[4842]: W0311 19:08:48.883761 4842 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddcf6712_e4ab_4aa4_848e_46de1967ef16.slice/crio-61e655e44377f9fae827209d2b7877c17ec80d3d7e751e5f5bf455999917ae6a WatchSource:0}: Error finding container 61e655e44377f9fae827209d2b7877c17ec80d3d7e751e5f5bf455999917ae6a: Status 404 returned error can't find the container with id 61e655e44377f9fae827209d2b7877c17ec80d3d7e751e5f5bf455999917ae6a Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.259124 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-db-create-mwt27"] Mar 11 19:08:49 crc kubenswrapper[4842]: W0311 19:08:49.268560 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51f35e50_4944_4740_a5a2_f35bfc66b4d7.slice/crio-7ddc519e89e213aff406d906777a43e68d60809a93f4531df1a26731bd3f66f6 WatchSource:0}: Error finding container 7ddc519e89e213aff406d906777a43e68d60809a93f4531df1a26731bd3f66f6: Status 404 returned error can't find the container with id 7ddc519e89e213aff406d906777a43e68d60809a93f4531df1a26731bd3f66f6 Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.337874 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-2bc1-account-create-update-zklwm"] Mar 11 19:08:49 crc kubenswrapper[4842]: W0311 19:08:49.341732 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86fa2647_8583_4275_a4ed_f664fb1b1c20.slice/crio-7fc74d36af1c5e6016f58c2cd9bb07ed348becbbd52ac92f6701696d4e434e00 WatchSource:0}: Error finding container 7fc74d36af1c5e6016f58c2cd9bb07ed348becbbd52ac92f6701696d4e434e00: Status 404 returned error can't find the container with id 7fc74d36af1c5e6016f58c2cd9bb07ed348becbbd52ac92f6701696d4e434e00 Mar 11 19:08:49 crc kubenswrapper[4842]: E0311 19:08:49.515124 4842 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddcf6712_e4ab_4aa4_848e_46de1967ef16.slice/crio-2a7b06faeff9601af31d2935eccc03b55a6eaface49a0619f30007b15a26f44f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddcf6712_e4ab_4aa4_848e_46de1967ef16.slice/crio-conmon-2a7b06faeff9601af31d2935eccc03b55a6eaface49a0619f30007b15a26f44f.scope\": RecentStats: unable to find data in memory cache]" Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.827821 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-create-mwt27" event={"ID":"51f35e50-4944-4740-a5a2-f35bfc66b4d7","Type":"ContainerStarted","Data":"3015bf7bfb6a123616bf2f72563ed9e438d15f01bb1d593fc982b7198d3eef88"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.828317 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-create-mwt27" event={"ID":"51f35e50-4944-4740-a5a2-f35bfc66b4d7","Type":"ContainerStarted","Data":"7ddc519e89e213aff406d906777a43e68d60809a93f4531df1a26731bd3f66f6"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.835753 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" event={"ID":"86fa2647-8583-4275-a4ed-f664fb1b1c20","Type":"ContainerStarted","Data":"7881df410e62ea15b3042b5252e8acf2583cc8f1829451ade594c50292032a1e"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.835808 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" event={"ID":"86fa2647-8583-4275-a4ed-f664fb1b1c20","Type":"ContainerStarted","Data":"7fc74d36af1c5e6016f58c2cd9bb07ed348becbbd52ac92f6701696d4e434e00"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.838442 4842 generic.go:334] "Generic (PLEG): container finished" 
podID="f9b7de43-4cd4-4c79-bc25-f88450b0b0fa" containerID="f504992ca2617068de6efac7481db69261c7fffaadee27c157bdbae757b8ac09" exitCode=0 Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.838506 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-db-create-xwxph" event={"ID":"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa","Type":"ContainerDied","Data":"f504992ca2617068de6efac7481db69261c7fffaadee27c157bdbae757b8ac09"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.838529 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-db-create-xwxph" event={"ID":"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa","Type":"ContainerStarted","Data":"0d700b34dd559c90a94eba6c3f9ce2a1f1921b6b7be6f676ef7e56a5a2b88668"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.840197 4842 generic.go:334] "Generic (PLEG): container finished" podID="ddcf6712-e4ab-4aa4-848e-46de1967ef16" containerID="2a7b06faeff9601af31d2935eccc03b55a6eaface49a0619f30007b15a26f44f" exitCode=0 Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.840497 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" event={"ID":"ddcf6712-e4ab-4aa4-848e-46de1967ef16","Type":"ContainerDied","Data":"2a7b06faeff9601af31d2935eccc03b55a6eaface49a0619f30007b15a26f44f"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.840524 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" event={"ID":"ddcf6712-e4ab-4aa4-848e-46de1967ef16","Type":"ContainerStarted","Data":"61e655e44377f9fae827209d2b7877c17ec80d3d7e751e5f5bf455999917ae6a"} Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.861255 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/placement-db-create-mwt27" podStartSLOduration=1.861235545 podStartE2EDuration="1.861235545s" podCreationTimestamp="2026-03-11 19:08:48 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:08:49.852689821 +0000 UTC m=+1175.500386141" watchObservedRunningTime="2026-03-11 19:08:49.861235545 +0000 UTC m=+1175.508931825" Mar 11 19:08:49 crc kubenswrapper[4842]: I0311 19:08:49.900305 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" podStartSLOduration=1.900288496 podStartE2EDuration="1.900288496s" podCreationTimestamp="2026-03-11 19:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:08:49.895805704 +0000 UTC m=+1175.543501994" watchObservedRunningTime="2026-03-11 19:08:49.900288496 +0000 UTC m=+1175.547984776" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.128956 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.221255 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t65p\" (UniqueName: \"kubernetes.io/projected/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-kube-api-access-4t65p\") pod \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.221387 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-operator-scripts\") pod \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\" (UID: \"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5\") " Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.222512 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3f652d9-5b7b-4b4c-ae09-317de7bff3c5" (UID: "f3f652d9-5b7b-4b4c-ae09-317de7bff3c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.228718 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-kube-api-access-4t65p" (OuterVolumeSpecName: "kube-api-access-4t65p") pod "f3f652d9-5b7b-4b4c-ae09-317de7bff3c5" (UID: "f3f652d9-5b7b-4b4c-ae09-317de7bff3c5"). InnerVolumeSpecName "kube-api-access-4t65p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.323340 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t65p\" (UniqueName: \"kubernetes.io/projected/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-kube-api-access-4t65p\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.323385 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.867476 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/root-account-create-update-s864f" event={"ID":"f3f652d9-5b7b-4b4c-ae09-317de7bff3c5","Type":"ContainerDied","Data":"eb1080d7195e07ee174979c7d9306bf56cc598cdef1f4507aa7ea94326bc3743"} Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.867586 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb1080d7195e07ee174979c7d9306bf56cc598cdef1f4507aa7ea94326bc3743" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.867686 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/root-account-create-update-s864f" Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.869094 4842 generic.go:334] "Generic (PLEG): container finished" podID="51f35e50-4944-4740-a5a2-f35bfc66b4d7" containerID="3015bf7bfb6a123616bf2f72563ed9e438d15f01bb1d593fc982b7198d3eef88" exitCode=0 Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.869185 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-create-mwt27" event={"ID":"51f35e50-4944-4740-a5a2-f35bfc66b4d7","Type":"ContainerDied","Data":"3015bf7bfb6a123616bf2f72563ed9e438d15f01bb1d593fc982b7198d3eef88"} Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.871602 4842 generic.go:334] "Generic (PLEG): container finished" podID="86fa2647-8583-4275-a4ed-f664fb1b1c20" containerID="7881df410e62ea15b3042b5252e8acf2583cc8f1829451ade594c50292032a1e" exitCode=0 Mar 11 19:08:50 crc kubenswrapper[4842]: I0311 19:08:50.871687 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" event={"ID":"86fa2647-8583-4275-a4ed-f664fb1b1c20","Type":"ContainerDied","Data":"7881df410e62ea15b3042b5252e8acf2583cc8f1829451ade594c50292032a1e"} Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.249182 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.257831 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.341887 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-operator-scripts\") pod \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.341938 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd2vx\" (UniqueName: \"kubernetes.io/projected/ddcf6712-e4ab-4aa4-848e-46de1967ef16-kube-api-access-zd2vx\") pod \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.342003 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddcf6712-e4ab-4aa4-848e-46de1967ef16-operator-scripts\") pod \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\" (UID: \"ddcf6712-e4ab-4aa4-848e-46de1967ef16\") " Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.342079 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2p9s\" (UniqueName: \"kubernetes.io/projected/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-kube-api-access-p2p9s\") pod \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\" (UID: \"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa\") " Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.343322 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9b7de43-4cd4-4c79-bc25-f88450b0b0fa" (UID: "f9b7de43-4cd4-4c79-bc25-f88450b0b0fa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.343414 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcf6712-e4ab-4aa4-848e-46de1967ef16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddcf6712-e4ab-4aa4-848e-46de1967ef16" (UID: "ddcf6712-e4ab-4aa4-848e-46de1967ef16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.349572 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcf6712-e4ab-4aa4-848e-46de1967ef16-kube-api-access-zd2vx" (OuterVolumeSpecName: "kube-api-access-zd2vx") pod "ddcf6712-e4ab-4aa4-848e-46de1967ef16" (UID: "ddcf6712-e4ab-4aa4-848e-46de1967ef16"). InnerVolumeSpecName "kube-api-access-zd2vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.353601 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-kube-api-access-p2p9s" (OuterVolumeSpecName: "kube-api-access-p2p9s") pod "f9b7de43-4cd4-4c79-bc25-f88450b0b0fa" (UID: "f9b7de43-4cd4-4c79-bc25-f88450b0b0fa"). InnerVolumeSpecName "kube-api-access-p2p9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.443766 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddcf6712-e4ab-4aa4-848e-46de1967ef16-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.443798 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2p9s\" (UniqueName: \"kubernetes.io/projected/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-kube-api-access-p2p9s\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.443809 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.443820 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd2vx\" (UniqueName: \"kubernetes.io/projected/ddcf6712-e4ab-4aa4-848e-46de1967ef16-kube-api-access-zd2vx\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.884529 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/keystone-db-create-xwxph" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.884879 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-db-create-xwxph" event={"ID":"f9b7de43-4cd4-4c79-bc25-f88450b0b0fa","Type":"ContainerDied","Data":"0d700b34dd559c90a94eba6c3f9ce2a1f1921b6b7be6f676ef7e56a5a2b88668"} Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.885148 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d700b34dd559c90a94eba6c3f9ce2a1f1921b6b7be6f676ef7e56a5a2b88668" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.886983 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" event={"ID":"ddcf6712-e4ab-4aa4-848e-46de1967ef16","Type":"ContainerDied","Data":"61e655e44377f9fae827209d2b7877c17ec80d3d7e751e5f5bf455999917ae6a"} Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.887148 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61e655e44377f9fae827209d2b7877c17ec80d3d7e751e5f5bf455999917ae6a" Mar 11 19:08:51 crc kubenswrapper[4842]: I0311 19:08:51.887064 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-06c9-account-create-update-ln4k5" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.192785 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.257872 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clg8t\" (UniqueName: \"kubernetes.io/projected/51f35e50-4944-4740-a5a2-f35bfc66b4d7-kube-api-access-clg8t\") pod \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.258071 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f35e50-4944-4740-a5a2-f35bfc66b4d7-operator-scripts\") pod \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\" (UID: \"51f35e50-4944-4740-a5a2-f35bfc66b4d7\") " Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.259324 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51f35e50-4944-4740-a5a2-f35bfc66b4d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51f35e50-4944-4740-a5a2-f35bfc66b4d7" (UID: "51f35e50-4944-4740-a5a2-f35bfc66b4d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.263559 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f35e50-4944-4740-a5a2-f35bfc66b4d7-kube-api-access-clg8t" (OuterVolumeSpecName: "kube-api-access-clg8t") pod "51f35e50-4944-4740-a5a2-f35bfc66b4d7" (UID: "51f35e50-4944-4740-a5a2-f35bfc66b4d7"). InnerVolumeSpecName "kube-api-access-clg8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.313611 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.358979 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86fa2647-8583-4275-a4ed-f664fb1b1c20-operator-scripts\") pod \"86fa2647-8583-4275-a4ed-f664fb1b1c20\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.359043 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txsjg\" (UniqueName: \"kubernetes.io/projected/86fa2647-8583-4275-a4ed-f664fb1b1c20-kube-api-access-txsjg\") pod \"86fa2647-8583-4275-a4ed-f664fb1b1c20\" (UID: \"86fa2647-8583-4275-a4ed-f664fb1b1c20\") " Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.359241 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51f35e50-4944-4740-a5a2-f35bfc66b4d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.359253 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clg8t\" (UniqueName: \"kubernetes.io/projected/51f35e50-4944-4740-a5a2-f35bfc66b4d7-kube-api-access-clg8t\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.359975 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86fa2647-8583-4275-a4ed-f664fb1b1c20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86fa2647-8583-4275-a4ed-f664fb1b1c20" (UID: "86fa2647-8583-4275-a4ed-f664fb1b1c20"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.362111 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86fa2647-8583-4275-a4ed-f664fb1b1c20-kube-api-access-txsjg" (OuterVolumeSpecName: "kube-api-access-txsjg") pod "86fa2647-8583-4275-a4ed-f664fb1b1c20" (UID: "86fa2647-8583-4275-a4ed-f664fb1b1c20"). InnerVolumeSpecName "kube-api-access-txsjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.461515 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86fa2647-8583-4275-a4ed-f664fb1b1c20-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.461583 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txsjg\" (UniqueName: \"kubernetes.io/projected/86fa2647-8583-4275-a4ed-f664fb1b1c20-kube-api-access-txsjg\") on node \"crc\" DevicePath \"\"" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.900551 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-create-mwt27" event={"ID":"51f35e50-4944-4740-a5a2-f35bfc66b4d7","Type":"ContainerDied","Data":"7ddc519e89e213aff406d906777a43e68d60809a93f4531df1a26731bd3f66f6"} Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.900596 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ddc519e89e213aff406d906777a43e68d60809a93f4531df1a26731bd3f66f6" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.900611 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/placement-db-create-mwt27" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.902366 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" event={"ID":"86fa2647-8583-4275-a4ed-f664fb1b1c20","Type":"ContainerDied","Data":"7fc74d36af1c5e6016f58c2cd9bb07ed348becbbd52ac92f6701696d4e434e00"} Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.902384 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc74d36af1c5e6016f58c2cd9bb07ed348becbbd52ac92f6701696d4e434e00" Mar 11 19:08:52 crc kubenswrapper[4842]: I0311 19:08:52.902482 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-2bc1-account-create-update-zklwm" Mar 11 19:08:53 crc kubenswrapper[4842]: I0311 19:08:53.209908 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/root-account-create-update-s864f"] Mar 11 19:08:53 crc kubenswrapper[4842]: I0311 19:08:53.218181 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/root-account-create-update-s864f"] Mar 11 19:08:54 crc kubenswrapper[4842]: I0311 19:08:54.979825 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3f652d9-5b7b-4b4c-ae09-317de7bff3c5" path="/var/lib/kubelet/pods/f3f652d9-5b7b-4b4c-ae09-317de7bff3c5/volumes" Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.970777 4842 generic.go:334] "Generic (PLEG): container finished" podID="8101bb7b-9fb5-418b-b490-e465171babc5" containerID="398b2c970a637deb7770c6f05b8a29a06f286e0efe98c9a092b36de1e3e05336" exitCode=0 Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.970913 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-notifications-server-0" 
event={"ID":"8101bb7b-9fb5-418b-b490-e465171babc5","Type":"ContainerDied","Data":"398b2c970a637deb7770c6f05b8a29a06f286e0efe98c9a092b36de1e3e05336"} Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.974236 4842 generic.go:334] "Generic (PLEG): container finished" podID="baa6ffd5-2b78-4119-b6f1-a70465d5288d" containerID="5ade1946c57393611158e883a697542362da716ea0ad82d60e110cc7b4325868" exitCode=0 Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.974420 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" event={"ID":"baa6ffd5-2b78-4119-b6f1-a70465d5288d","Type":"ContainerDied","Data":"5ade1946c57393611158e883a697542362da716ea0ad82d60e110cc7b4325868"} Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.977308 4842 generic.go:334] "Generic (PLEG): container finished" podID="e12d431f-86df-44d1-9877-3eb3c698d089" containerID="f879c8b32f50a512684539f89273e3af49b2a0fb5fe3eb06288271a08f4d9fab" exitCode=0 Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.977385 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-cell1-server-0" event={"ID":"e12d431f-86df-44d1-9877-3eb3c698d089","Type":"ContainerDied","Data":"f879c8b32f50a512684539f89273e3af49b2a0fb5fe3eb06288271a08f4d9fab"} Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.988078 4842 generic.go:334] "Generic (PLEG): container finished" podID="13c13109-88f5-4c0d-9c15-739f9622af9d" containerID="b62080bc5105fad42c55f0ac02bd5c1f661941f2668ca4b4c42ce9dfbdb38f9e" exitCode=0 Mar 11 19:08:57 crc kubenswrapper[4842]: I0311 19:08:57.988244 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-server-0" event={"ID":"13c13109-88f5-4c0d-9c15-739f9622af9d","Type":"ContainerDied","Data":"b62080bc5105fad42c55f0ac02bd5c1f661941f2668ca4b4c42ce9dfbdb38f9e"} Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.250997 4842 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["nova-kuttl-default/root-account-create-update-5v6ch"] Mar 11 19:08:58 crc kubenswrapper[4842]: E0311 19:08:58.251907 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f35e50-4944-4740-a5a2-f35bfc66b4d7" containerName="mariadb-database-create" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.251924 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f35e50-4944-4740-a5a2-f35bfc66b4d7" containerName="mariadb-database-create" Mar 11 19:08:58 crc kubenswrapper[4842]: E0311 19:08:58.251948 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fa2647-8583-4275-a4ed-f664fb1b1c20" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.251975 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fa2647-8583-4275-a4ed-f664fb1b1c20" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: E0311 19:08:58.251985 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcf6712-e4ab-4aa4-848e-46de1967ef16" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.251993 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcf6712-e4ab-4aa4-848e-46de1967ef16" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: E0311 19:08:58.252013 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3f652d9-5b7b-4b4c-ae09-317de7bff3c5" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.252021 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f652d9-5b7b-4b4c-ae09-317de7bff3c5" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: E0311 19:08:58.252032 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b7de43-4cd4-4c79-bc25-f88450b0b0fa" containerName="mariadb-database-create" Mar 11 19:08:58 crc 
kubenswrapper[4842]: I0311 19:08:58.252039 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b7de43-4cd4-4c79-bc25-f88450b0b0fa" containerName="mariadb-database-create" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.252221 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b7de43-4cd4-4c79-bc25-f88450b0b0fa" containerName="mariadb-database-create" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.252231 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fa2647-8583-4275-a4ed-f664fb1b1c20" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.252245 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3f652d9-5b7b-4b4c-ae09-317de7bff3c5" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.252254 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f35e50-4944-4740-a5a2-f35bfc66b4d7" containerName="mariadb-database-create" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.252321 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcf6712-e4ab-4aa4-848e-46de1967ef16" containerName="mariadb-account-create-update" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.253125 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.256435 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"openstack-cell1-mariadb-root-db-secret" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.258533 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/root-account-create-update-5v6ch"] Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.310960 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/039d49b9-4b53-43b7-9d1b-871c543d17ed-operator-scripts\") pod \"root-account-create-update-5v6ch\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.311079 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2jq9\" (UniqueName: \"kubernetes.io/projected/039d49b9-4b53-43b7-9d1b-871c543d17ed-kube-api-access-r2jq9\") pod \"root-account-create-update-5v6ch\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.414214 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2jq9\" (UniqueName: \"kubernetes.io/projected/039d49b9-4b53-43b7-9d1b-871c543d17ed-kube-api-access-r2jq9\") pod \"root-account-create-update-5v6ch\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.414324 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/039d49b9-4b53-43b7-9d1b-871c543d17ed-operator-scripts\") pod \"root-account-create-update-5v6ch\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.415457 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/039d49b9-4b53-43b7-9d1b-871c543d17ed-operator-scripts\") pod \"root-account-create-update-5v6ch\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.434702 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2jq9\" (UniqueName: \"kubernetes.io/projected/039d49b9-4b53-43b7-9d1b-871c543d17ed-kube-api-access-r2jq9\") pod \"root-account-create-update-5v6ch\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.657151 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:08:58 crc kubenswrapper[4842]: I0311 19:08:58.978117 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/root-account-create-update-5v6ch"] Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.004480 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/root-account-create-update-5v6ch" event={"ID":"039d49b9-4b53-43b7-9d1b-871c543d17ed","Type":"ContainerStarted","Data":"87f60081c5cfa3f3f35cda1bb52fa5b2c6ae40baafcbe200674c5da1fcde4de2"} Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.007341 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-cell1-server-0" event={"ID":"e12d431f-86df-44d1-9877-3eb3c698d089","Type":"ContainerStarted","Data":"76962dc5f3314185fd5391c43b13a8b323d880eda00110f633368a4df4a49242"} Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.007579 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/rabbitmq-cell1-server-0" Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.025465 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-server-0" event={"ID":"13c13109-88f5-4c0d-9c15-739f9622af9d","Type":"ContainerStarted","Data":"0a753c357c4b91e7dc3b6fe2a8b26948656c102818bd848940784b5fe29616f6"} Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.025663 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.033444 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"8101bb7b-9fb5-418b-b490-e465171babc5","Type":"ContainerStarted","Data":"6c895b716682a1e761891e3b6586a93001297798c80f0e94097cb7ceb31c9125"} Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.033711 4842 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="nova-kuttl-default/rabbitmq-notifications-server-0" Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.042099 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" event={"ID":"baa6ffd5-2b78-4119-b6f1-a70465d5288d","Type":"ContainerStarted","Data":"245251e7c1d7b9f4ed433c4f5d0830c3597341b9a4ffa3d49c7393612474b397"} Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.042305 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.069687 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/rabbitmq-notifications-server-0" podStartSLOduration=-9223371983.785105 podStartE2EDuration="53.069670624s" podCreationTimestamp="2026-03-11 19:08:06 +0000 UTC" firstStartedPulling="2026-03-11 19:08:07.919573052 +0000 UTC m=+1133.567269332" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:08:59.066125615 +0000 UTC m=+1184.713821895" watchObservedRunningTime="2026-03-11 19:08:59.069670624 +0000 UTC m=+1184.717366894" Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.071137 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/rabbitmq-cell1-server-0" podStartSLOduration=-9223371982.783644 podStartE2EDuration="54.07113165s" podCreationTimestamp="2026-03-11 19:08:05 +0000 UTC" firstStartedPulling="2026-03-11 19:08:07.57823035 +0000 UTC m=+1133.225926630" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:08:59.044124042 +0000 UTC m=+1184.691820322" watchObservedRunningTime="2026-03-11 19:08:59.07113165 +0000 UTC m=+1184.718827930" Mar 11 19:08:59 crc kubenswrapper[4842]: I0311 19:08:59.130978 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" podStartSLOduration=40.245646859 podStartE2EDuration="53.130957003s" podCreationTimestamp="2026-03-11 19:08:06 +0000 UTC" firstStartedPulling="2026-03-11 19:08:08.554389114 +0000 UTC m=+1134.202085394" lastFinishedPulling="2026-03-11 19:08:21.439699248 +0000 UTC m=+1147.087395538" observedRunningTime="2026-03-11 19:08:59.102911598 +0000 UTC m=+1184.750607878" watchObservedRunningTime="2026-03-11 19:08:59.130957003 +0000 UTC m=+1184.778653283" Mar 11 19:09:00 crc kubenswrapper[4842]: I0311 19:09:00.071127 4842 generic.go:334] "Generic (PLEG): container finished" podID="039d49b9-4b53-43b7-9d1b-871c543d17ed" containerID="e9aef4079a359c758b236029cb262ec400935a6ce589d05b12396a6477b58011" exitCode=0 Mar 11 19:09:00 crc kubenswrapper[4842]: I0311 19:09:00.071201 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/root-account-create-update-5v6ch" event={"ID":"039d49b9-4b53-43b7-9d1b-871c543d17ed","Type":"ContainerDied","Data":"e9aef4079a359c758b236029cb262ec400935a6ce589d05b12396a6477b58011"} Mar 11 19:09:00 crc kubenswrapper[4842]: I0311 19:09:00.109027 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/rabbitmq-server-0" podStartSLOduration=-9223371982.745768 podStartE2EDuration="54.109007273s" podCreationTimestamp="2026-03-11 19:08:06 +0000 UTC" firstStartedPulling="2026-03-11 19:08:07.929138783 +0000 UTC m=+1133.576835063" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:08:59.133784184 +0000 UTC m=+1184.781480464" watchObservedRunningTime="2026-03-11 19:09:00.109007273 +0000 UTC m=+1185.756703563" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.380390 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.471881 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.471985 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.472064 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.472128 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/039d49b9-4b53-43b7-9d1b-871c543d17ed-operator-scripts\") pod \"039d49b9-4b53-43b7-9d1b-871c543d17ed\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.472383 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2jq9\" (UniqueName: \"kubernetes.io/projected/039d49b9-4b53-43b7-9d1b-871c543d17ed-kube-api-access-r2jq9\") pod \"039d49b9-4b53-43b7-9d1b-871c543d17ed\" (UID: \"039d49b9-4b53-43b7-9d1b-871c543d17ed\") " Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.473009 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/039d49b9-4b53-43b7-9d1b-871c543d17ed-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "039d49b9-4b53-43b7-9d1b-871c543d17ed" (UID: "039d49b9-4b53-43b7-9d1b-871c543d17ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.473394 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0ede0d62e1ca5af886d8ad032f52cc79f17aa7c91031e5d4935ed627d33421d"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.473494 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://d0ede0d62e1ca5af886d8ad032f52cc79f17aa7c91031e5d4935ed627d33421d" gracePeriod=600 Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.479517 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/039d49b9-4b53-43b7-9d1b-871c543d17ed-kube-api-access-r2jq9" (OuterVolumeSpecName: "kube-api-access-r2jq9") pod "039d49b9-4b53-43b7-9d1b-871c543d17ed" (UID: "039d49b9-4b53-43b7-9d1b-871c543d17ed"). InnerVolumeSpecName "kube-api-access-r2jq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.574570 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2jq9\" (UniqueName: \"kubernetes.io/projected/039d49b9-4b53-43b7-9d1b-871c543d17ed-kube-api-access-r2jq9\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:01 crc kubenswrapper[4842]: I0311 19:09:01.574624 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/039d49b9-4b53-43b7-9d1b-871c543d17ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:02 crc kubenswrapper[4842]: I0311 19:09:02.093117 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="d0ede0d62e1ca5af886d8ad032f52cc79f17aa7c91031e5d4935ed627d33421d" exitCode=0 Mar 11 19:09:02 crc kubenswrapper[4842]: I0311 19:09:02.093185 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"d0ede0d62e1ca5af886d8ad032f52cc79f17aa7c91031e5d4935ed627d33421d"} Mar 11 19:09:02 crc kubenswrapper[4842]: I0311 19:09:02.093679 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"0e0132978f744075878dba0b5cc46ab6911da7e9e6a8e99f3a4db40255e33bd4"} Mar 11 19:09:02 crc kubenswrapper[4842]: I0311 19:09:02.093700 4842 scope.go:117] "RemoveContainer" containerID="741cee34022ccdac48b6c603ba201ced3a7f7803c4c8a38143440982a01cfafb" Mar 11 19:09:02 crc kubenswrapper[4842]: I0311 19:09:02.096810 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/root-account-create-update-5v6ch" 
event={"ID":"039d49b9-4b53-43b7-9d1b-871c543d17ed","Type":"ContainerDied","Data":"87f60081c5cfa3f3f35cda1bb52fa5b2c6ae40baafcbe200674c5da1fcde4de2"} Mar 11 19:09:02 crc kubenswrapper[4842]: I0311 19:09:02.096847 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f60081c5cfa3f3f35cda1bb52fa5b2c6ae40baafcbe200674c5da1fcde4de2" Mar 11 19:09:02 crc kubenswrapper[4842]: I0311 19:09:02.096920 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/root-account-create-update-5v6ch" Mar 11 19:09:17 crc kubenswrapper[4842]: I0311 19:09:17.101561 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/rabbitmq-cell1-server-0" Mar 11 19:09:17 crc kubenswrapper[4842]: I0311 19:09:17.412552 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/rabbitmq-notifications-server-0" Mar 11 19:09:17 crc kubenswrapper[4842]: I0311 19:09:17.672564 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/rabbitmq-server-0" Mar 11 19:09:17 crc kubenswrapper[4842]: I0311 19:09:17.995358 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/rabbitmq-broadcaster-server-0" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.335802 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/keystone-db-sync-g42cg"] Mar 11 19:09:19 crc kubenswrapper[4842]: E0311 19:09:19.336127 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039d49b9-4b53-43b7-9d1b-871c543d17ed" containerName="mariadb-account-create-update" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.336138 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="039d49b9-4b53-43b7-9d1b-871c543d17ed" containerName="mariadb-account-create-update" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.336326 4842 
memory_manager.go:354] "RemoveStaleState removing state" podUID="039d49b9-4b53-43b7-9d1b-871c543d17ed" containerName="mariadb-account-create-update" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.336776 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.339082 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-keystone-dockercfg-4p9fc" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.339211 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-scripts" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.339117 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.348549 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-config-data" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.359531 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-db-sync-g42cg"] Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.513039 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-combined-ca-bundle\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.513081 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-config-data\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " 
pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.513140 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95j9f\" (UniqueName: \"kubernetes.io/projected/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-kube-api-access-95j9f\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.614758 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95j9f\" (UniqueName: \"kubernetes.io/projected/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-kube-api-access-95j9f\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.614886 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-combined-ca-bundle\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.614911 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-config-data\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.621729 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-config-data\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " 
pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.633127 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-combined-ca-bundle\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.639466 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95j9f\" (UniqueName: \"kubernetes.io/projected/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-kube-api-access-95j9f\") pod \"keystone-db-sync-g42cg\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.651493 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:19 crc kubenswrapper[4842]: I0311 19:09:19.909311 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-db-sync-g42cg"] Mar 11 19:09:20 crc kubenswrapper[4842]: I0311 19:09:20.272059 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-db-sync-g42cg" event={"ID":"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb","Type":"ContainerStarted","Data":"f13e9a386e2445d3f5713000a34fac126ad9a0a0e9ad3bed8e2fb695677d7ff5"} Mar 11 19:09:26 crc kubenswrapper[4842]: I0311 19:09:26.323926 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-db-sync-g42cg" event={"ID":"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb","Type":"ContainerStarted","Data":"de126fc9d74f3844e49ae78aad19ca0ca51e5d2f1483c30709e045c2c3f82936"} Mar 11 19:09:26 crc kubenswrapper[4842]: I0311 19:09:26.339987 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/keystone-db-sync-g42cg" 
podStartSLOduration=1.220213356 podStartE2EDuration="7.339967308s" podCreationTimestamp="2026-03-11 19:09:19 +0000 UTC" firstStartedPulling="2026-03-11 19:09:19.917607907 +0000 UTC m=+1205.565304187" lastFinishedPulling="2026-03-11 19:09:26.037361859 +0000 UTC m=+1211.685058139" observedRunningTime="2026-03-11 19:09:26.337489346 +0000 UTC m=+1211.985185636" watchObservedRunningTime="2026-03-11 19:09:26.339967308 +0000 UTC m=+1211.987663598" Mar 11 19:09:29 crc kubenswrapper[4842]: I0311 19:09:29.352699 4842 generic.go:334] "Generic (PLEG): container finished" podID="7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" containerID="de126fc9d74f3844e49ae78aad19ca0ca51e5d2f1483c30709e045c2c3f82936" exitCode=0 Mar 11 19:09:29 crc kubenswrapper[4842]: I0311 19:09:29.352891 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-db-sync-g42cg" event={"ID":"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb","Type":"ContainerDied","Data":"de126fc9d74f3844e49ae78aad19ca0ca51e5d2f1483c30709e045c2c3f82936"} Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.655151 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.800077 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95j9f\" (UniqueName: \"kubernetes.io/projected/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-kube-api-access-95j9f\") pod \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.800229 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-config-data\") pod \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.800344 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-combined-ca-bundle\") pod \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\" (UID: \"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb\") " Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.805310 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-kube-api-access-95j9f" (OuterVolumeSpecName: "kube-api-access-95j9f") pod "7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" (UID: "7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb"). InnerVolumeSpecName "kube-api-access-95j9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.825245 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" (UID: "7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.842152 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-config-data" (OuterVolumeSpecName: "config-data") pod "7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" (UID: "7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.902527 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.902707 4842 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:30 crc kubenswrapper[4842]: I0311 19:09:30.902817 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95j9f\" (UniqueName: \"kubernetes.io/projected/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb-kube-api-access-95j9f\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.370461 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-db-sync-g42cg" event={"ID":"7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb","Type":"ContainerDied","Data":"f13e9a386e2445d3f5713000a34fac126ad9a0a0e9ad3bed8e2fb695677d7ff5"} Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.370499 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13e9a386e2445d3f5713000a34fac126ad9a0a0e9ad3bed8e2fb695677d7ff5" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.370566 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/keystone-db-sync-g42cg" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.588492 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/keystone-bootstrap-p2klm"] Mar 11 19:09:31 crc kubenswrapper[4842]: E0311 19:09:31.589198 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" containerName="keystone-db-sync" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.589220 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" containerName="keystone-db-sync" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.589427 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" containerName="keystone-db-sync" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.590034 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.592285 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"osp-secret" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.592337 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-scripts" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.594514 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-config-data" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.594553 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-keystone-dockercfg-4p9fc" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.595154 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.597982 4842 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["nova-kuttl-default/keystone-bootstrap-p2klm"] Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.715209 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlcl7\" (UniqueName: \"kubernetes.io/projected/39093594-4bc6-479d-9e74-d0f95699106e-kube-api-access-hlcl7\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.715318 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-combined-ca-bundle\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.715364 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-scripts\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.715420 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-config-data\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.715497 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-fernet-keys\") pod \"keystone-bootstrap-p2klm\" 
(UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.715534 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-credential-keys\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.778984 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/placement-db-sync-nk2fn"] Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.783817 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.788568 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"placement-config-data" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.789216 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"placement-scripts" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.789413 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"placement-placement-dockercfg-m5x9x" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.803944 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-db-sync-nk2fn"] Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.820123 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-scripts\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.820231 
4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-config-data\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.820355 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-fernet-keys\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.820442 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-credential-keys\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.820522 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlcl7\" (UniqueName: \"kubernetes.io/projected/39093594-4bc6-479d-9e74-d0f95699106e-kube-api-access-hlcl7\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.820868 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-combined-ca-bundle\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.824084 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-combined-ca-bundle\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.824088 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-scripts\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.824972 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-fernet-keys\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.825289 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-credential-keys\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.831264 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-config-data\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.839389 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlcl7\" (UniqueName: 
\"kubernetes.io/projected/39093594-4bc6-479d-9e74-d0f95699106e-kube-api-access-hlcl7\") pod \"keystone-bootstrap-p2klm\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.912867 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.921990 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-scripts\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.922042 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-config-data\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.922064 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8988f9d-f9a5-4611-b74f-12833fb5b143-logs\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.922080 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-combined-ca-bundle\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" 
Mar 11 19:09:31 crc kubenswrapper[4842]: I0311 19:09:31.922103 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72qq\" (UniqueName: \"kubernetes.io/projected/b8988f9d-f9a5-4611-b74f-12833fb5b143-kube-api-access-b72qq\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.033546 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-scripts\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.035227 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-config-data\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.035282 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8988f9d-f9a5-4611-b74f-12833fb5b143-logs\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.035306 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-combined-ca-bundle\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.035341 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b72qq\" (UniqueName: \"kubernetes.io/projected/b8988f9d-f9a5-4611-b74f-12833fb5b143-kube-api-access-b72qq\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.036247 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8988f9d-f9a5-4611-b74f-12833fb5b143-logs\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.040824 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-config-data\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.045323 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-combined-ca-bundle\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.067546 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-scripts\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.069478 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72qq\" (UniqueName: 
\"kubernetes.io/projected/b8988f9d-f9a5-4611-b74f-12833fb5b143-kube-api-access-b72qq\") pod \"placement-db-sync-nk2fn\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.104128 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.489123 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-bootstrap-p2klm"] Mar 11 19:09:32 crc kubenswrapper[4842]: W0311 19:09:32.493067 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39093594_4bc6_479d_9e74_d0f95699106e.slice/crio-eea3ee47923cf7a8bce4c830ad4b7e1a5e4d74d58888e69871244e46b5d705ac WatchSource:0}: Error finding container eea3ee47923cf7a8bce4c830ad4b7e1a5e4d74d58888e69871244e46b5d705ac: Status 404 returned error can't find the container with id eea3ee47923cf7a8bce4c830ad4b7e1a5e4d74d58888e69871244e46b5d705ac Mar 11 19:09:32 crc kubenswrapper[4842]: I0311 19:09:32.611279 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-db-sync-nk2fn"] Mar 11 19:09:33 crc kubenswrapper[4842]: I0311 19:09:33.386537 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-p2klm" event={"ID":"39093594-4bc6-479d-9e74-d0f95699106e","Type":"ContainerStarted","Data":"574593bc2b814737bcb08e25483588c41dae41b2f19d7eb546096a6a91a6f760"} Mar 11 19:09:33 crc kubenswrapper[4842]: I0311 19:09:33.387032 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-p2klm" event={"ID":"39093594-4bc6-479d-9e74-d0f95699106e","Type":"ContainerStarted","Data":"eea3ee47923cf7a8bce4c830ad4b7e1a5e4d74d58888e69871244e46b5d705ac"} Mar 11 19:09:33 crc kubenswrapper[4842]: I0311 19:09:33.392590 4842 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-sync-nk2fn" event={"ID":"b8988f9d-f9a5-4611-b74f-12833fb5b143","Type":"ContainerStarted","Data":"dc931ad16110f581f4ea8604a3e5137f2b2face38f5e4f8a9535cb8d3bebcab4"} Mar 11 19:09:33 crc kubenswrapper[4842]: I0311 19:09:33.410047 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/keystone-bootstrap-p2klm" podStartSLOduration=2.410027386 podStartE2EDuration="2.410027386s" podCreationTimestamp="2026-03-11 19:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:09:33.402914437 +0000 UTC m=+1219.050610737" watchObservedRunningTime="2026-03-11 19:09:33.410027386 +0000 UTC m=+1219.057723666" Mar 11 19:09:36 crc kubenswrapper[4842]: I0311 19:09:36.440263 4842 generic.go:334] "Generic (PLEG): container finished" podID="39093594-4bc6-479d-9e74-d0f95699106e" containerID="574593bc2b814737bcb08e25483588c41dae41b2f19d7eb546096a6a91a6f760" exitCode=0 Mar 11 19:09:36 crc kubenswrapper[4842]: I0311 19:09:36.440311 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-p2klm" event={"ID":"39093594-4bc6-479d-9e74-d0f95699106e","Type":"ContainerDied","Data":"574593bc2b814737bcb08e25483588c41dae41b2f19d7eb546096a6a91a6f760"} Mar 11 19:09:36 crc kubenswrapper[4842]: I0311 19:09:36.444691 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-sync-nk2fn" event={"ID":"b8988f9d-f9a5-4611-b74f-12833fb5b143","Type":"ContainerStarted","Data":"17185d003426a59e826a7a36a128aa44c461dfa1f1d4a03d090fbd48ddb2e352"} Mar 11 19:09:36 crc kubenswrapper[4842]: I0311 19:09:36.502334 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/placement-db-sync-nk2fn" podStartSLOduration=2.416932909 podStartE2EDuration="5.502312331s" 
podCreationTimestamp="2026-03-11 19:09:31 +0000 UTC" firstStartedPulling="2026-03-11 19:09:32.614288662 +0000 UTC m=+1218.261984952" lastFinishedPulling="2026-03-11 19:09:35.699668094 +0000 UTC m=+1221.347364374" observedRunningTime="2026-03-11 19:09:36.498963117 +0000 UTC m=+1222.146659437" watchObservedRunningTime="2026-03-11 19:09:36.502312331 +0000 UTC m=+1222.150008621" Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.860989 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.934786 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-scripts\") pod \"39093594-4bc6-479d-9e74-d0f95699106e\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.934849 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-fernet-keys\") pod \"39093594-4bc6-479d-9e74-d0f95699106e\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.934884 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlcl7\" (UniqueName: \"kubernetes.io/projected/39093594-4bc6-479d-9e74-d0f95699106e-kube-api-access-hlcl7\") pod \"39093594-4bc6-479d-9e74-d0f95699106e\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.934904 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-credential-keys\") pod \"39093594-4bc6-479d-9e74-d0f95699106e\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " Mar 11 19:09:37 crc 
kubenswrapper[4842]: I0311 19:09:37.934925 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-combined-ca-bundle\") pod \"39093594-4bc6-479d-9e74-d0f95699106e\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.934966 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-config-data\") pod \"39093594-4bc6-479d-9e74-d0f95699106e\" (UID: \"39093594-4bc6-479d-9e74-d0f95699106e\") " Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.939907 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-scripts" (OuterVolumeSpecName: "scripts") pod "39093594-4bc6-479d-9e74-d0f95699106e" (UID: "39093594-4bc6-479d-9e74-d0f95699106e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.940385 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "39093594-4bc6-479d-9e74-d0f95699106e" (UID: "39093594-4bc6-479d-9e74-d0f95699106e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.941947 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39093594-4bc6-479d-9e74-d0f95699106e-kube-api-access-hlcl7" (OuterVolumeSpecName: "kube-api-access-hlcl7") pod "39093594-4bc6-479d-9e74-d0f95699106e" (UID: "39093594-4bc6-479d-9e74-d0f95699106e"). InnerVolumeSpecName "kube-api-access-hlcl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.943405 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "39093594-4bc6-479d-9e74-d0f95699106e" (UID: "39093594-4bc6-479d-9e74-d0f95699106e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.956917 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-config-data" (OuterVolumeSpecName: "config-data") pod "39093594-4bc6-479d-9e74-d0f95699106e" (UID: "39093594-4bc6-479d-9e74-d0f95699106e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:37 crc kubenswrapper[4842]: I0311 19:09:37.957056 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39093594-4bc6-479d-9e74-d0f95699106e" (UID: "39093594-4bc6-479d-9e74-d0f95699106e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.036835 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlcl7\" (UniqueName: \"kubernetes.io/projected/39093594-4bc6-479d-9e74-d0f95699106e-kube-api-access-hlcl7\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.037236 4842 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.037251 4842 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.037313 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.037336 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.037352 4842 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/39093594-4bc6-479d-9e74-d0f95699106e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.466874 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-p2klm" event={"ID":"39093594-4bc6-479d-9e74-d0f95699106e","Type":"ContainerDied","Data":"eea3ee47923cf7a8bce4c830ad4b7e1a5e4d74d58888e69871244e46b5d705ac"} Mar 11 19:09:38 crc 
kubenswrapper[4842]: I0311 19:09:38.466946 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eea3ee47923cf7a8bce4c830ad4b7e1a5e4d74d58888e69871244e46b5d705ac" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.467930 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-p2klm" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.469753 4842 generic.go:334] "Generic (PLEG): container finished" podID="b8988f9d-f9a5-4611-b74f-12833fb5b143" containerID="17185d003426a59e826a7a36a128aa44c461dfa1f1d4a03d090fbd48ddb2e352" exitCode=0 Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.469788 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-sync-nk2fn" event={"ID":"b8988f9d-f9a5-4611-b74f-12833fb5b143","Type":"ContainerDied","Data":"17185d003426a59e826a7a36a128aa44c461dfa1f1d4a03d090fbd48ddb2e352"} Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.553172 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/keystone-bootstrap-p2klm"] Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.560763 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/keystone-bootstrap-p2klm"] Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.632462 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/keystone-bootstrap-sdw9r"] Mar 11 19:09:38 crc kubenswrapper[4842]: E0311 19:09:38.632801 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39093594-4bc6-479d-9e74-d0f95699106e" containerName="keystone-bootstrap" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.632818 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="39093594-4bc6-479d-9e74-d0f95699106e" containerName="keystone-bootstrap" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.632948 4842 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="39093594-4bc6-479d-9e74-d0f95699106e" containerName="keystone-bootstrap" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.633428 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.635474 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.635546 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-scripts" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.636969 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-config-data" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.637746 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"osp-secret" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.637960 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-keystone-dockercfg-4p9fc" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.647859 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-bootstrap-sdw9r"] Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.746695 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-credential-keys\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.746780 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-scripts\") pod \"keystone-bootstrap-sdw9r\" 
(UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.746822 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-combined-ca-bundle\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.746886 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-config-data\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.746925 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-fernet-keys\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.746946 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfdg\" (UniqueName: \"kubernetes.io/projected/6edfaec1-653a-4bd8-b8b1-13180eefe66b-kube-api-access-ckfdg\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.848676 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-credential-keys\") pod 
\"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.848828 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-scripts\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.848986 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-combined-ca-bundle\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.849802 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-config-data\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.849903 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-fernet-keys\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.849970 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfdg\" (UniqueName: \"kubernetes.io/projected/6edfaec1-653a-4bd8-b8b1-13180eefe66b-kube-api-access-ckfdg\") pod \"keystone-bootstrap-sdw9r\" (UID: 
\"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.854545 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-combined-ca-bundle\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.859152 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-config-data\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.859686 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-credential-keys\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.861658 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-scripts\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.864224 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-fernet-keys\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc 
kubenswrapper[4842]: I0311 19:09:38.868299 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfdg\" (UniqueName: \"kubernetes.io/projected/6edfaec1-653a-4bd8-b8b1-13180eefe66b-kube-api-access-ckfdg\") pod \"keystone-bootstrap-sdw9r\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.947849 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:38 crc kubenswrapper[4842]: I0311 19:09:38.980212 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39093594-4bc6-479d-9e74-d0f95699106e" path="/var/lib/kubelet/pods/39093594-4bc6-479d-9e74-d0f95699106e/volumes" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.464642 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-bootstrap-sdw9r"] Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.477172 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" event={"ID":"6edfaec1-653a-4bd8-b8b1-13180eefe66b","Type":"ContainerStarted","Data":"d659f0c6bee75d2ea06cc666370bbca28a8341013fb7a7c2d50a138e9465ab42"} Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.698890 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.762259 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-config-data\") pod \"b8988f9d-f9a5-4611-b74f-12833fb5b143\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.762352 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-combined-ca-bundle\") pod \"b8988f9d-f9a5-4611-b74f-12833fb5b143\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.762392 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-scripts\") pod \"b8988f9d-f9a5-4611-b74f-12833fb5b143\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.762442 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b72qq\" (UniqueName: \"kubernetes.io/projected/b8988f9d-f9a5-4611-b74f-12833fb5b143-kube-api-access-b72qq\") pod \"b8988f9d-f9a5-4611-b74f-12833fb5b143\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.762479 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8988f9d-f9a5-4611-b74f-12833fb5b143-logs\") pod \"b8988f9d-f9a5-4611-b74f-12833fb5b143\" (UID: \"b8988f9d-f9a5-4611-b74f-12833fb5b143\") " Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.763047 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b8988f9d-f9a5-4611-b74f-12833fb5b143-logs" (OuterVolumeSpecName: "logs") pod "b8988f9d-f9a5-4611-b74f-12833fb5b143" (UID: "b8988f9d-f9a5-4611-b74f-12833fb5b143"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.767440 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-scripts" (OuterVolumeSpecName: "scripts") pod "b8988f9d-f9a5-4611-b74f-12833fb5b143" (UID: "b8988f9d-f9a5-4611-b74f-12833fb5b143"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.767448 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8988f9d-f9a5-4611-b74f-12833fb5b143-kube-api-access-b72qq" (OuterVolumeSpecName: "kube-api-access-b72qq") pod "b8988f9d-f9a5-4611-b74f-12833fb5b143" (UID: "b8988f9d-f9a5-4611-b74f-12833fb5b143"). InnerVolumeSpecName "kube-api-access-b72qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.781824 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8988f9d-f9a5-4611-b74f-12833fb5b143" (UID: "b8988f9d-f9a5-4611-b74f-12833fb5b143"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.781927 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-config-data" (OuterVolumeSpecName: "config-data") pod "b8988f9d-f9a5-4611-b74f-12833fb5b143" (UID: "b8988f9d-f9a5-4611-b74f-12833fb5b143"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.864567 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.864617 4842 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.864629 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8988f9d-f9a5-4611-b74f-12833fb5b143-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.864637 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b72qq\" (UniqueName: \"kubernetes.io/projected/b8988f9d-f9a5-4611-b74f-12833fb5b143-kube-api-access-b72qq\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:39 crc kubenswrapper[4842]: I0311 19:09:39.864646 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8988f9d-f9a5-4611-b74f-12833fb5b143-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.528026 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" event={"ID":"6edfaec1-653a-4bd8-b8b1-13180eefe66b","Type":"ContainerStarted","Data":"5973b07023f76418605b1d1a497ef3a022a1ae8c731d75c6374fd0cfefc88d8b"} Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.529712 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-db-sync-nk2fn" 
event={"ID":"b8988f9d-f9a5-4611-b74f-12833fb5b143","Type":"ContainerDied","Data":"dc931ad16110f581f4ea8604a3e5137f2b2face38f5e4f8a9535cb8d3bebcab4"} Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.529761 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc931ad16110f581f4ea8604a3e5137f2b2face38f5e4f8a9535cb8d3bebcab4" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.529833 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-db-sync-nk2fn" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.554834 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" podStartSLOduration=2.5548070899999997 podStartE2EDuration="2.55480709s" podCreationTimestamp="2026-03-11 19:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:09:40.55401822 +0000 UTC m=+1226.201714550" watchObservedRunningTime="2026-03-11 19:09:40.55480709 +0000 UTC m=+1226.202503370" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.941183 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/placement-6559b86c84-dxqm4"] Mar 11 19:09:40 crc kubenswrapper[4842]: E0311 19:09:40.942173 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8988f9d-f9a5-4611-b74f-12833fb5b143" containerName="placement-db-sync" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.942254 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8988f9d-f9a5-4611-b74f-12833fb5b143" containerName="placement-db-sync" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.943475 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8988f9d-f9a5-4611-b74f-12833fb5b143" containerName="placement-db-sync" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.944489 4842 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.950974 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"placement-scripts" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.953390 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"placement-config-data" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.953579 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"placement-placement-dockercfg-m5x9x" Mar 11 19:09:40 crc kubenswrapper[4842]: I0311 19:09:40.954405 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-6559b86c84-dxqm4"] Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.088233 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8q4m\" (UniqueName: \"kubernetes.io/projected/1e66e9da-6930-456f-97d4-cee682c6a0c5-kube-api-access-j8q4m\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.088323 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-config-data\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.088438 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e66e9da-6930-456f-97d4-cee682c6a0c5-logs\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " 
pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.088552 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-scripts\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.088572 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-combined-ca-bundle\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.190168 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-scripts\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.190221 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-combined-ca-bundle\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.190284 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8q4m\" (UniqueName: \"kubernetes.io/projected/1e66e9da-6930-456f-97d4-cee682c6a0c5-kube-api-access-j8q4m\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " 
pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.190310 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-config-data\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.190339 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e66e9da-6930-456f-97d4-cee682c6a0c5-logs\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.190799 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e66e9da-6930-456f-97d4-cee682c6a0c5-logs\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.197192 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-combined-ca-bundle\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.201903 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-config-data\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 
19:09:41.211786 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-scripts\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.218978 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8q4m\" (UniqueName: \"kubernetes.io/projected/1e66e9da-6930-456f-97d4-cee682c6a0c5-kube-api-access-j8q4m\") pod \"placement-6559b86c84-dxqm4\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.266922 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:41 crc kubenswrapper[4842]: I0311 19:09:41.765389 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-6559b86c84-dxqm4"] Mar 11 19:09:42 crc kubenswrapper[4842]: I0311 19:09:42.558615 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" event={"ID":"1e66e9da-6930-456f-97d4-cee682c6a0c5","Type":"ContainerStarted","Data":"9d3892e9b908d6045653740656760f4ff739fa4640d624846e052bcd940358ec"} Mar 11 19:09:43 crc kubenswrapper[4842]: I0311 19:09:43.571309 4842 generic.go:334] "Generic (PLEG): container finished" podID="6edfaec1-653a-4bd8-b8b1-13180eefe66b" containerID="5973b07023f76418605b1d1a497ef3a022a1ae8c731d75c6374fd0cfefc88d8b" exitCode=0 Mar 11 19:09:43 crc kubenswrapper[4842]: I0311 19:09:43.571333 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" event={"ID":"6edfaec1-653a-4bd8-b8b1-13180eefe66b","Type":"ContainerDied","Data":"5973b07023f76418605b1d1a497ef3a022a1ae8c731d75c6374fd0cfefc88d8b"} Mar 11 19:09:43 crc 
kubenswrapper[4842]: I0311 19:09:43.574470 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" event={"ID":"1e66e9da-6930-456f-97d4-cee682c6a0c5","Type":"ContainerStarted","Data":"d6850e1973b6ba9c5129a22de0cb4eacb85f1d7292b24786969124945fa887d0"} Mar 11 19:09:43 crc kubenswrapper[4842]: I0311 19:09:43.574512 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" event={"ID":"1e66e9da-6930-456f-97d4-cee682c6a0c5","Type":"ContainerStarted","Data":"5b0b6d2741c28ce5ca92ee81e43bfb40e0ba7761e2fae35c212e438e0980826b"} Mar 11 19:09:43 crc kubenswrapper[4842]: I0311 19:09:43.574741 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:43 crc kubenswrapper[4842]: I0311 19:09:43.631507 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" podStartSLOduration=3.631473303 podStartE2EDuration="3.631473303s" podCreationTimestamp="2026-03-11 19:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:09:43.618653331 +0000 UTC m=+1229.266349651" watchObservedRunningTime="2026-03-11 19:09:43.631473303 +0000 UTC m=+1229.279169613" Mar 11 19:09:44 crc kubenswrapper[4842]: I0311 19:09:44.581280 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:09:44 crc kubenswrapper[4842]: I0311 19:09:44.954745 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.066722 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-config-data\") pod \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.066777 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-credential-keys\") pod \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.066802 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckfdg\" (UniqueName: \"kubernetes.io/projected/6edfaec1-653a-4bd8-b8b1-13180eefe66b-kube-api-access-ckfdg\") pod \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.066824 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-combined-ca-bundle\") pod \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.066848 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-fernet-keys\") pod \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.066915 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-scripts\") pod \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\" (UID: \"6edfaec1-653a-4bd8-b8b1-13180eefe66b\") " Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.072934 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6edfaec1-653a-4bd8-b8b1-13180eefe66b" (UID: "6edfaec1-653a-4bd8-b8b1-13180eefe66b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.085107 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6edfaec1-653a-4bd8-b8b1-13180eefe66b" (UID: "6edfaec1-653a-4bd8-b8b1-13180eefe66b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.085477 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edfaec1-653a-4bd8-b8b1-13180eefe66b-kube-api-access-ckfdg" (OuterVolumeSpecName: "kube-api-access-ckfdg") pod "6edfaec1-653a-4bd8-b8b1-13180eefe66b" (UID: "6edfaec1-653a-4bd8-b8b1-13180eefe66b"). InnerVolumeSpecName "kube-api-access-ckfdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.086193 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-scripts" (OuterVolumeSpecName: "scripts") pod "6edfaec1-653a-4bd8-b8b1-13180eefe66b" (UID: "6edfaec1-653a-4bd8-b8b1-13180eefe66b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.092218 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-config-data" (OuterVolumeSpecName: "config-data") pod "6edfaec1-653a-4bd8-b8b1-13180eefe66b" (UID: "6edfaec1-653a-4bd8-b8b1-13180eefe66b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.099110 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6edfaec1-653a-4bd8-b8b1-13180eefe66b" (UID: "6edfaec1-653a-4bd8-b8b1-13180eefe66b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.168462 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.168508 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.168525 4842 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.168543 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckfdg\" (UniqueName: \"kubernetes.io/projected/6edfaec1-653a-4bd8-b8b1-13180eefe66b-kube-api-access-ckfdg\") on node \"crc\" DevicePath \"\"" Mar 11 
19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.168559 4842 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.168574 4842 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6edfaec1-653a-4bd8-b8b1-13180eefe66b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.593307 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" event={"ID":"6edfaec1-653a-4bd8-b8b1-13180eefe66b","Type":"ContainerDied","Data":"d659f0c6bee75d2ea06cc666370bbca28a8341013fb7a7c2d50a138e9465ab42"} Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.593363 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d659f0c6bee75d2ea06cc666370bbca28a8341013fb7a7c2d50a138e9465ab42" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.593413 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/keystone-bootstrap-sdw9r" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.808592 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/keystone-64f57b6d8c-cz78k"] Mar 11 19:09:45 crc kubenswrapper[4842]: E0311 19:09:45.808937 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6edfaec1-653a-4bd8-b8b1-13180eefe66b" containerName="keystone-bootstrap" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.808950 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6edfaec1-653a-4bd8-b8b1-13180eefe66b" containerName="keystone-bootstrap" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.809084 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6edfaec1-653a-4bd8-b8b1-13180eefe66b" containerName="keystone-bootstrap" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.809574 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.814347 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-config-data" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.814608 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-scripts" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.814479 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone-keystone-dockercfg-4p9fc" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.814726 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"keystone" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.834739 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-64f57b6d8c-cz78k"] Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.883826 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mkw\" (UniqueName: \"kubernetes.io/projected/4291a0cb-5c38-424b-bc49-301aab1e1f1a-kube-api-access-t2mkw\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.883900 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-config-data\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.883934 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-credential-keys\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.884202 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-combined-ca-bundle\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.884340 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-scripts\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 
19:09:45.884388 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-fernet-keys\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.986372 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-config-data\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.986453 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-credential-keys\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.986507 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-combined-ca-bundle\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.986544 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-scripts\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.986568 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-fernet-keys\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.986685 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mkw\" (UniqueName: \"kubernetes.io/projected/4291a0cb-5c38-424b-bc49-301aab1e1f1a-kube-api-access-t2mkw\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.991632 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-combined-ca-bundle\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.994739 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-fernet-keys\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.995295 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-credential-keys\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.995346 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-scripts\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:45 crc kubenswrapper[4842]: I0311 19:09:45.998070 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4291a0cb-5c38-424b-bc49-301aab1e1f1a-config-data\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:46 crc kubenswrapper[4842]: I0311 19:09:46.034989 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mkw\" (UniqueName: \"kubernetes.io/projected/4291a0cb-5c38-424b-bc49-301aab1e1f1a-kube-api-access-t2mkw\") pod \"keystone-64f57b6d8c-cz78k\" (UID: \"4291a0cb-5c38-424b-bc49-301aab1e1f1a\") " pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:46 crc kubenswrapper[4842]: I0311 19:09:46.128342 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:46 crc kubenswrapper[4842]: I0311 19:09:46.641305 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/keystone-64f57b6d8c-cz78k"] Mar 11 19:09:46 crc kubenswrapper[4842]: W0311 19:09:46.648498 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4291a0cb_5c38_424b_bc49_301aab1e1f1a.slice/crio-336f848dedeb5cc10d08333f8c7cee485e97ba2c4571d2288422fbc333dbd9cc WatchSource:0}: Error finding container 336f848dedeb5cc10d08333f8c7cee485e97ba2c4571d2288422fbc333dbd9cc: Status 404 returned error can't find the container with id 336f848dedeb5cc10d08333f8c7cee485e97ba2c4571d2288422fbc333dbd9cc Mar 11 19:09:47 crc kubenswrapper[4842]: I0311 19:09:47.612929 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" event={"ID":"4291a0cb-5c38-424b-bc49-301aab1e1f1a","Type":"ContainerStarted","Data":"e1dbf4b7ecb31ccda2d2248a4fa3ec1ae25749c499e89c50b1e6aeba5732c6e0"} Mar 11 19:09:47 crc kubenswrapper[4842]: I0311 19:09:47.612986 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" event={"ID":"4291a0cb-5c38-424b-bc49-301aab1e1f1a","Type":"ContainerStarted","Data":"336f848dedeb5cc10d08333f8c7cee485e97ba2c4571d2288422fbc333dbd9cc"} Mar 11 19:09:47 crc kubenswrapper[4842]: I0311 19:09:47.613116 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:09:47 crc kubenswrapper[4842]: I0311 19:09:47.680953 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" podStartSLOduration=2.680917967 podStartE2EDuration="2.680917967s" podCreationTimestamp="2026-03-11 19:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:09:47.654074136 +0000 UTC m=+1233.301770456" watchObservedRunningTime="2026-03-11 19:09:47.680917967 +0000 UTC m=+1233.328614247" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.157222 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554270-4phpn"] Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.159171 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554270-4phpn" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.163500 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.163645 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.163717 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.169418 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554270-4phpn"] Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.282060 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d42k\" (UniqueName: \"kubernetes.io/projected/9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf-kube-api-access-8d42k\") pod \"auto-csr-approver-29554270-4phpn\" (UID: \"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf\") " pod="openshift-infra/auto-csr-approver-29554270-4phpn" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.384017 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d42k\" (UniqueName: \"kubernetes.io/projected/9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf-kube-api-access-8d42k\") pod 
\"auto-csr-approver-29554270-4phpn\" (UID: \"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf\") " pod="openshift-infra/auto-csr-approver-29554270-4phpn" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.406729 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d42k\" (UniqueName: \"kubernetes.io/projected/9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf-kube-api-access-8d42k\") pod \"auto-csr-approver-29554270-4phpn\" (UID: \"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf\") " pod="openshift-infra/auto-csr-approver-29554270-4phpn" Mar 11 19:10:00 crc kubenswrapper[4842]: I0311 19:10:00.492541 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554270-4phpn" Mar 11 19:10:01 crc kubenswrapper[4842]: W0311 19:10:01.030216 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e3b5fd4_ea6b_4622_b97a_018a6f8bcedf.slice/crio-4c3c7e777611731921fb785135ba917aaacdeb485f1267d17a1d063dc7b5af46 WatchSource:0}: Error finding container 4c3c7e777611731921fb785135ba917aaacdeb485f1267d17a1d063dc7b5af46: Status 404 returned error can't find the container with id 4c3c7e777611731921fb785135ba917aaacdeb485f1267d17a1d063dc7b5af46 Mar 11 19:10:01 crc kubenswrapper[4842]: I0311 19:10:01.031110 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554270-4phpn"] Mar 11 19:10:01 crc kubenswrapper[4842]: I0311 19:10:01.760494 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554270-4phpn" event={"ID":"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf","Type":"ContainerStarted","Data":"4c3c7e777611731921fb785135ba917aaacdeb485f1267d17a1d063dc7b5af46"} Mar 11 19:10:02 crc kubenswrapper[4842]: I0311 19:10:02.770514 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554270-4phpn" 
event={"ID":"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf","Type":"ContainerStarted","Data":"bb244240fb85d311334659590232a25399b599cb31f8c9d93233e2e094d0b8f6"} Mar 11 19:10:02 crc kubenswrapper[4842]: I0311 19:10:02.790471 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29554270-4phpn" podStartSLOduration=1.5854885909999998 podStartE2EDuration="2.790447213s" podCreationTimestamp="2026-03-11 19:10:00 +0000 UTC" firstStartedPulling="2026-03-11 19:10:01.051026932 +0000 UTC m=+1246.698723232" lastFinishedPulling="2026-03-11 19:10:02.255985534 +0000 UTC m=+1247.903681854" observedRunningTime="2026-03-11 19:10:02.788758528 +0000 UTC m=+1248.436454808" watchObservedRunningTime="2026-03-11 19:10:02.790447213 +0000 UTC m=+1248.438143524" Mar 11 19:10:03 crc kubenswrapper[4842]: I0311 19:10:03.779015 4842 generic.go:334] "Generic (PLEG): container finished" podID="9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf" containerID="bb244240fb85d311334659590232a25399b599cb31f8c9d93233e2e094d0b8f6" exitCode=0 Mar 11 19:10:03 crc kubenswrapper[4842]: I0311 19:10:03.779075 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554270-4phpn" event={"ID":"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf","Type":"ContainerDied","Data":"bb244240fb85d311334659590232a25399b599cb31f8c9d93233e2e094d0b8f6"} Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.134591 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554270-4phpn" Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.179736 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d42k\" (UniqueName: \"kubernetes.io/projected/9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf-kube-api-access-8d42k\") pod \"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf\" (UID: \"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf\") " Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.200191 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf-kube-api-access-8d42k" (OuterVolumeSpecName: "kube-api-access-8d42k") pod "9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf" (UID: "9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf"). InnerVolumeSpecName "kube-api-access-8d42k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.281616 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d42k\" (UniqueName: \"kubernetes.io/projected/9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf-kube-api-access-8d42k\") on node \"crc\" DevicePath \"\"" Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.798006 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554270-4phpn" event={"ID":"9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf","Type":"ContainerDied","Data":"4c3c7e777611731921fb785135ba917aaacdeb485f1267d17a1d063dc7b5af46"} Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.798058 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c3c7e777611731921fb785135ba917aaacdeb485f1267d17a1d063dc7b5af46" Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.798160 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554270-4phpn" Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.879513 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554264-265lh"] Mar 11 19:10:05 crc kubenswrapper[4842]: I0311 19:10:05.887387 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554264-265lh"] Mar 11 19:10:06 crc kubenswrapper[4842]: I0311 19:10:06.974365 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6340e59d-3320-4e33-872c-5e809b37cf69" path="/var/lib/kubelet/pods/6340e59d-3320-4e33-872c-5e809b37cf69/volumes" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.295967 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.297754 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.540817 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/placement-79b56db87d-ltvb2"] Mar 11 19:10:12 crc kubenswrapper[4842]: E0311 19:10:12.541576 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf" containerName="oc" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.541590 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf" containerName="oc" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.541757 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf" containerName="oc" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.542545 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.559859 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-79b56db87d-ltvb2"] Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.650518 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-combined-ca-bundle\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.650589 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-config-data\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.650748 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbl2w\" (UniqueName: \"kubernetes.io/projected/35af45e3-739f-4769-a843-c951ad001e2e-kube-api-access-dbl2w\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.650826 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-scripts\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.650963 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35af45e3-739f-4769-a843-c951ad001e2e-logs\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.751883 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-combined-ca-bundle\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.751931 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-config-data\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.751968 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbl2w\" (UniqueName: \"kubernetes.io/projected/35af45e3-739f-4769-a843-c951ad001e2e-kube-api-access-dbl2w\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.751992 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-scripts\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.752030 4842 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35af45e3-739f-4769-a843-c951ad001e2e-logs\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.752653 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35af45e3-739f-4769-a843-c951ad001e2e-logs\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.759046 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-scripts\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.763202 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-combined-ca-bundle\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.768057 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35af45e3-739f-4769-a843-c951ad001e2e-config-data\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.770547 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbl2w\" (UniqueName: 
\"kubernetes.io/projected/35af45e3-739f-4769-a843-c951ad001e2e-kube-api-access-dbl2w\") pod \"placement-79b56db87d-ltvb2\" (UID: \"35af45e3-739f-4769-a843-c951ad001e2e\") " pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:12 crc kubenswrapper[4842]: I0311 19:10:12.865647 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:13 crc kubenswrapper[4842]: I0311 19:10:13.149333 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/placement-79b56db87d-ltvb2"] Mar 11 19:10:13 crc kubenswrapper[4842]: I0311 19:10:13.882560 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" event={"ID":"35af45e3-739f-4769-a843-c951ad001e2e","Type":"ContainerStarted","Data":"e1c4c70197a2021705d8a2e3a3e7e75bc81bb299a144d63371c351988e9bc719"} Mar 11 19:10:13 crc kubenswrapper[4842]: I0311 19:10:13.883313 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:13 crc kubenswrapper[4842]: I0311 19:10:13.883353 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" event={"ID":"35af45e3-739f-4769-a843-c951ad001e2e","Type":"ContainerStarted","Data":"01d74dabd9957fca2e82126650fc2d4e5aada360d5de06856451cf120ad0bbcd"} Mar 11 19:10:13 crc kubenswrapper[4842]: I0311 19:10:13.883562 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" event={"ID":"35af45e3-739f-4769-a843-c951ad001e2e","Type":"ContainerStarted","Data":"ea451c25bb1e182d48bb709338f5f151d2465f406f61b5d05863b6c6c90dfbb7"} Mar 11 19:10:13 crc kubenswrapper[4842]: I0311 19:10:13.922072 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" podStartSLOduration=1.9220462280000001 podStartE2EDuration="1.922046228s" 
podCreationTimestamp="2026-03-11 19:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:10:13.90907514 +0000 UTC m=+1259.556771430" watchObservedRunningTime="2026-03-11 19:10:13.922046228 +0000 UTC m=+1259.569742518" Mar 11 19:10:14 crc kubenswrapper[4842]: I0311 19:10:14.903891 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:17 crc kubenswrapper[4842]: I0311 19:10:17.594976 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/keystone-64f57b6d8c-cz78k" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.152324 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/openstackclient"] Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.153678 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.156519 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"nova-kuttl-default"/"openstack-config" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.158634 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"openstackclient-openstackclient-dockercfg-mqgnn" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.166052 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/openstackclient"] Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.166396 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"openstack-config-secret" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.284301 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.284357 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.284391 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-openstack-config\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.284426 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2md5d\" (UniqueName: \"kubernetes.io/projected/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-kube-api-access-2md5d\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.386606 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2md5d\" (UniqueName: \"kubernetes.io/projected/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-kube-api-access-2md5d\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.387039 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.387078 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.387117 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-openstack-config\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.388108 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-openstack-config\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.393706 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.397591 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.409066 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2md5d\" (UniqueName: \"kubernetes.io/projected/b7dcae57-2024-4bfa-b657-f16d16bfd6c7-kube-api-access-2md5d\") pod \"openstackclient\" (UID: \"b7dcae57-2024-4bfa-b657-f16d16bfd6c7\") " pod="nova-kuttl-default/openstackclient" Mar 11 19:10:18 crc kubenswrapper[4842]: I0311 19:10:18.478072 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/openstackclient" Mar 11 19:10:19 crc kubenswrapper[4842]: I0311 19:10:18.745424 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/openstackclient"] Mar 11 19:10:19 crc kubenswrapper[4842]: I0311 19:10:18.944566 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstackclient" event={"ID":"b7dcae57-2024-4bfa-b657-f16d16bfd6c7","Type":"ContainerStarted","Data":"e5da632ab157c198e4034bb21bff4fe66c1bee7189ab514fdf7516fa42e24850"} Mar 11 19:10:27 crc kubenswrapper[4842]: I0311 19:10:27.023891 4842 scope.go:117] "RemoveContainer" containerID="f440bca9948b779e9200ccc88fc942abc9fa18ee2907d6784a57858fafbc21f2" Mar 11 19:10:28 crc kubenswrapper[4842]: I0311 19:10:28.025461 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/openstackclient" event={"ID":"b7dcae57-2024-4bfa-b657-f16d16bfd6c7","Type":"ContainerStarted","Data":"f23f85cf5dcc4edfc7f3d5256fd7237984d8fa2eaa8cfc7cdf7758032a85c377"} Mar 11 19:10:28 crc kubenswrapper[4842]: I0311 19:10:28.058780 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/openstackclient" podStartSLOduration=1.934642736 podStartE2EDuration="10.058742464s" podCreationTimestamp="2026-03-11 19:10:18 +0000 UTC" firstStartedPulling="2026-03-11 19:10:18.757218264 +0000 UTC m=+1264.404914544" 
lastFinishedPulling="2026-03-11 19:10:26.881317992 +0000 UTC m=+1272.529014272" observedRunningTime="2026-03-11 19:10:28.047846171 +0000 UTC m=+1273.695542481" watchObservedRunningTime="2026-03-11 19:10:28.058742464 +0000 UTC m=+1273.706438774" Mar 11 19:10:43 crc kubenswrapper[4842]: I0311 19:10:43.837233 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:43 crc kubenswrapper[4842]: I0311 19:10:43.837945 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/placement-79b56db87d-ltvb2" Mar 11 19:10:43 crc kubenswrapper[4842]: I0311 19:10:43.954916 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/placement-6559b86c84-dxqm4"] Mar 11 19:10:43 crc kubenswrapper[4842]: I0311 19:10:43.955455 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-log" containerID="cri-o://5b0b6d2741c28ce5ca92ee81e43bfb40e0ba7761e2fae35c212e438e0980826b" gracePeriod=30 Mar 11 19:10:43 crc kubenswrapper[4842]: I0311 19:10:43.957919 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-api" containerID="cri-o://d6850e1973b6ba9c5129a22de0cb4eacb85f1d7292b24786969124945fa887d0" gracePeriod=30 Mar 11 19:10:44 crc kubenswrapper[4842]: I0311 19:10:44.168842 4842 generic.go:334] "Generic (PLEG): container finished" podID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerID="5b0b6d2741c28ce5ca92ee81e43bfb40e0ba7761e2fae35c212e438e0980826b" exitCode=143 Mar 11 19:10:44 crc kubenswrapper[4842]: I0311 19:10:44.168886 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" 
event={"ID":"1e66e9da-6930-456f-97d4-cee682c6a0c5","Type":"ContainerDied","Data":"5b0b6d2741c28ce5ca92ee81e43bfb40e0ba7761e2fae35c212e438e0980826b"} Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.195546 4842 generic.go:334] "Generic (PLEG): container finished" podID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerID="d6850e1973b6ba9c5129a22de0cb4eacb85f1d7292b24786969124945fa887d0" exitCode=0 Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.196923 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" event={"ID":"1e66e9da-6930-456f-97d4-cee682c6a0c5","Type":"ContainerDied","Data":"d6850e1973b6ba9c5129a22de0cb4eacb85f1d7292b24786969124945fa887d0"} Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.482319 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.648028 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-config-data\") pod \"1e66e9da-6930-456f-97d4-cee682c6a0c5\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.648087 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-scripts\") pod \"1e66e9da-6930-456f-97d4-cee682c6a0c5\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.648177 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e66e9da-6930-456f-97d4-cee682c6a0c5-logs\") pod \"1e66e9da-6930-456f-97d4-cee682c6a0c5\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.648249 
4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-combined-ca-bundle\") pod \"1e66e9da-6930-456f-97d4-cee682c6a0c5\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.648353 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8q4m\" (UniqueName: \"kubernetes.io/projected/1e66e9da-6930-456f-97d4-cee682c6a0c5-kube-api-access-j8q4m\") pod \"1e66e9da-6930-456f-97d4-cee682c6a0c5\" (UID: \"1e66e9da-6930-456f-97d4-cee682c6a0c5\") " Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.649421 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e66e9da-6930-456f-97d4-cee682c6a0c5-logs" (OuterVolumeSpecName: "logs") pod "1e66e9da-6930-456f-97d4-cee682c6a0c5" (UID: "1e66e9da-6930-456f-97d4-cee682c6a0c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.653900 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-scripts" (OuterVolumeSpecName: "scripts") pod "1e66e9da-6930-456f-97d4-cee682c6a0c5" (UID: "1e66e9da-6930-456f-97d4-cee682c6a0c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.663754 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e66e9da-6930-456f-97d4-cee682c6a0c5-kube-api-access-j8q4m" (OuterVolumeSpecName: "kube-api-access-j8q4m") pod "1e66e9da-6930-456f-97d4-cee682c6a0c5" (UID: "1e66e9da-6930-456f-97d4-cee682c6a0c5"). InnerVolumeSpecName "kube-api-access-j8q4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.686095 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e66e9da-6930-456f-97d4-cee682c6a0c5" (UID: "1e66e9da-6930-456f-97d4-cee682c6a0c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.686881 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-config-data" (OuterVolumeSpecName: "config-data") pod "1e66e9da-6930-456f-97d4-cee682c6a0c5" (UID: "1e66e9da-6930-456f-97d4-cee682c6a0c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.749916 4842 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.749952 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8q4m\" (UniqueName: \"kubernetes.io/projected/1e66e9da-6930-456f-97d4-cee682c6a0c5-kube-api-access-j8q4m\") on node \"crc\" DevicePath \"\"" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.749965 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.749973 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e66e9da-6930-456f-97d4-cee682c6a0c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 
11 19:10:47 crc kubenswrapper[4842]: I0311 19:10:47.749982 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e66e9da-6930-456f-97d4-cee682c6a0c5-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:10:48 crc kubenswrapper[4842]: I0311 19:10:48.207149 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" event={"ID":"1e66e9da-6930-456f-97d4-cee682c6a0c5","Type":"ContainerDied","Data":"9d3892e9b908d6045653740656760f4ff739fa4640d624846e052bcd940358ec"} Mar 11 19:10:48 crc kubenswrapper[4842]: I0311 19:10:48.207214 4842 scope.go:117] "RemoveContainer" containerID="d6850e1973b6ba9c5129a22de0cb4eacb85f1d7292b24786969124945fa887d0" Mar 11 19:10:48 crc kubenswrapper[4842]: I0311 19:10:48.207321 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/placement-6559b86c84-dxqm4" Mar 11 19:10:48 crc kubenswrapper[4842]: I0311 19:10:48.237077 4842 scope.go:117] "RemoveContainer" containerID="5b0b6d2741c28ce5ca92ee81e43bfb40e0ba7761e2fae35c212e438e0980826b" Mar 11 19:10:48 crc kubenswrapper[4842]: I0311 19:10:48.278961 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/placement-6559b86c84-dxqm4"] Mar 11 19:10:48 crc kubenswrapper[4842]: I0311 19:10:48.283149 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/placement-6559b86c84-dxqm4"] Mar 11 19:10:48 crc kubenswrapper[4842]: I0311 19:10:48.971616 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" path="/var/lib/kubelet/pods/1e66e9da-6930-456f-97d4-cee682c6a0c5/volumes" Mar 11 19:11:01 crc kubenswrapper[4842]: I0311 19:11:01.472078 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:11:01 crc kubenswrapper[4842]: I0311 19:11:01.473042 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.391065 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx"] Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.391974 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" podUID="68ca87ac-6c44-49f1-b128-9593caa6b74c" containerName="manager" containerID="cri-o://f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8" gracePeriod=10 Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.494696 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"] Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.494949 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" podUID="3aa48e36-bcbd-4033-9cee-fc43aefb1b9a" containerName="operator" containerID="cri-o://1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b" gracePeriod=10 Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.825368 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-index-ff8nw"] Mar 11 19:11:02 crc kubenswrapper[4842]: E0311 19:11:02.826723 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-api" Mar 
11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.826741 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-api" Mar 11 19:11:02 crc kubenswrapper[4842]: E0311 19:11:02.826795 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-log" Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.826802 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-log" Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.827101 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-api" Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.827131 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e66e9da-6930-456f-97d4-cee682c6a0c5" containerName="placement-log" Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.828023 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-index-ff8nw" Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.909467 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-index-dockercfg-9jm4p" Mar 11 19:11:02 crc kubenswrapper[4842]: I0311 19:11:02.915843 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-index-ff8nw"] Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.048385 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfc9b\" (UniqueName: \"kubernetes.io/projected/e4afa999-7423-47dd-b966-3d81ee2cb0e3-kube-api-access-zfc9b\") pod \"nova-operator-index-ff8nw\" (UID: \"e4afa999-7423-47dd-b966-3d81ee2cb0e3\") " pod="openstack-operators/nova-operator-index-ff8nw" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.113019 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.149997 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfc9b\" (UniqueName: \"kubernetes.io/projected/e4afa999-7423-47dd-b966-3d81ee2cb0e3-kube-api-access-zfc9b\") pod \"nova-operator-index-ff8nw\" (UID: \"e4afa999-7423-47dd-b966-3d81ee2cb0e3\") " pod="openstack-operators/nova-operator-index-ff8nw" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.163771 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.179124 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfc9b\" (UniqueName: \"kubernetes.io/projected/e4afa999-7423-47dd-b966-3d81ee2cb0e3-kube-api-access-zfc9b\") pod \"nova-operator-index-ff8nw\" (UID: \"e4afa999-7423-47dd-b966-3d81ee2cb0e3\") " pod="openstack-operators/nova-operator-index-ff8nw" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.239660 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-index-ff8nw" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.252600 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4cbn\" (UniqueName: \"kubernetes.io/projected/3aa48e36-bcbd-4033-9cee-fc43aefb1b9a-kube-api-access-x4cbn\") pod \"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a\" (UID: \"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a\") " Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.252697 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f5xk\" (UniqueName: \"kubernetes.io/projected/68ca87ac-6c44-49f1-b128-9593caa6b74c-kube-api-access-9f5xk\") pod \"68ca87ac-6c44-49f1-b128-9593caa6b74c\" (UID: \"68ca87ac-6c44-49f1-b128-9593caa6b74c\") " Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.260208 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa48e36-bcbd-4033-9cee-fc43aefb1b9a-kube-api-access-x4cbn" (OuterVolumeSpecName: "kube-api-access-x4cbn") pod "3aa48e36-bcbd-4033-9cee-fc43aefb1b9a" (UID: "3aa48e36-bcbd-4033-9cee-fc43aefb1b9a"). InnerVolumeSpecName "kube-api-access-x4cbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.260457 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68ca87ac-6c44-49f1-b128-9593caa6b74c-kube-api-access-9f5xk" (OuterVolumeSpecName: "kube-api-access-9f5xk") pod "68ca87ac-6c44-49f1-b128-9593caa6b74c" (UID: "68ca87ac-6c44-49f1-b128-9593caa6b74c"). InnerVolumeSpecName "kube-api-access-9f5xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.356126 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4cbn\" (UniqueName: \"kubernetes.io/projected/3aa48e36-bcbd-4033-9cee-fc43aefb1b9a-kube-api-access-x4cbn\") on node \"crc\" DevicePath \"\"" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.356532 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f5xk\" (UniqueName: \"kubernetes.io/projected/68ca87ac-6c44-49f1-b128-9593caa6b74c-kube-api-access-9f5xk\") on node \"crc\" DevicePath \"\"" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.372089 4842 generic.go:334] "Generic (PLEG): container finished" podID="68ca87ac-6c44-49f1-b128-9593caa6b74c" containerID="f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8" exitCode=0 Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.372236 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.372252 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" event={"ID":"68ca87ac-6c44-49f1-b128-9593caa6b74c","Type":"ContainerDied","Data":"f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8"} Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.372363 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx" event={"ID":"68ca87ac-6c44-49f1-b128-9593caa6b74c","Type":"ContainerDied","Data":"aed4dd5203b88539dc24f182e767849968c0405846ba88aa02c279dbb82f8d63"} Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.372399 4842 scope.go:117] "RemoveContainer" containerID="f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.377181 4842 generic.go:334] "Generic (PLEG): container finished" podID="3aa48e36-bcbd-4033-9cee-fc43aefb1b9a" containerID="1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b" exitCode=0 Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.377248 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" event={"ID":"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a","Type":"ContainerDied","Data":"1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b"} Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.377301 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" event={"ID":"3aa48e36-bcbd-4033-9cee-fc43aefb1b9a","Type":"ContainerDied","Data":"e3d2fe1edd16831310f02c7d66d3b382863188a18ed5220c89118d8f29aa6369"} Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.377354 4842 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.398087 4842 scope.go:117] "RemoveContainer" containerID="f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8" Mar 11 19:11:03 crc kubenswrapper[4842]: E0311 19:11:03.399105 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8\": container with ID starting with f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8 not found: ID does not exist" containerID="f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.399252 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8"} err="failed to get container status \"f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8\": rpc error: code = NotFound desc = could not find container \"f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8\": container with ID starting with f0e98d61cf2cd3e624811d355dd842f7c0882caca1fcd9e40284dab18f030ea8 not found: ID does not exist" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.399302 4842 scope.go:117] "RemoveContainer" containerID="1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.420791 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx"] Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.432593 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/nova-operator-controller-manager-67b8c8c6bd-9kcrx"] Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 
19:11:03.441416 4842 scope.go:117] "RemoveContainer" containerID="1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b" Mar 11 19:11:03 crc kubenswrapper[4842]: E0311 19:11:03.441884 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b\": container with ID starting with 1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b not found: ID does not exist" containerID="1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.441941 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b"} err="failed to get container status \"1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b\": rpc error: code = NotFound desc = could not find container \"1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b\": container with ID starting with 1554e5d3d67d751ecd2afd8a911698dacb0878e85c5f230ba7a130d432eb543b not found: ID does not exist" Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.445383 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"] Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.454011 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58577bcd48-dkjvw"] Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.844747 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-index-ff8nw"] Mar 11 19:11:03 crc kubenswrapper[4842]: W0311 19:11:03.850336 4842 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4afa999_7423_47dd_b966_3d81ee2cb0e3.slice/crio-9b2cd677c24c76c0ecedf0bfea638383a19120e76a7dff06b0cd84ee0a9e10b7 WatchSource:0}: Error finding container 9b2cd677c24c76c0ecedf0bfea638383a19120e76a7dff06b0cd84ee0a9e10b7: Status 404 returned error can't find the container with id 9b2cd677c24c76c0ecedf0bfea638383a19120e76a7dff06b0cd84ee0a9e10b7 Mar 11 19:11:03 crc kubenswrapper[4842]: I0311 19:11:03.853759 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 19:11:04 crc kubenswrapper[4842]: I0311 19:11:04.388490 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-index-ff8nw" event={"ID":"e4afa999-7423-47dd-b966-3d81ee2cb0e3","Type":"ContainerStarted","Data":"631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c"} Mar 11 19:11:04 crc kubenswrapper[4842]: I0311 19:11:04.388868 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-index-ff8nw" event={"ID":"e4afa999-7423-47dd-b966-3d81ee2cb0e3","Type":"ContainerStarted","Data":"9b2cd677c24c76c0ecedf0bfea638383a19120e76a7dff06b0cd84ee0a9e10b7"} Mar 11 19:11:04 crc kubenswrapper[4842]: I0311 19:11:04.973532 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aa48e36-bcbd-4033-9cee-fc43aefb1b9a" path="/var/lib/kubelet/pods/3aa48e36-bcbd-4033-9cee-fc43aefb1b9a/volumes" Mar 11 19:11:04 crc kubenswrapper[4842]: I0311 19:11:04.973992 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68ca87ac-6c44-49f1-b128-9593caa6b74c" path="/var/lib/kubelet/pods/68ca87ac-6c44-49f1-b128-9593caa6b74c/volumes" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.440812 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-index-ff8nw" podStartSLOduration=3.210429554 podStartE2EDuration="3.440788978s" 
podCreationTimestamp="2026-03-11 19:11:02 +0000 UTC" firstStartedPulling="2026-03-11 19:11:03.853412011 +0000 UTC m=+1309.501108291" lastFinishedPulling="2026-03-11 19:11:04.083771435 +0000 UTC m=+1309.731467715" observedRunningTime="2026-03-11 19:11:04.408201526 +0000 UTC m=+1310.055897836" watchObservedRunningTime="2026-03-11 19:11:05.440788978 +0000 UTC m=+1311.088485258" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.444584 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/nova-operator-index-ff8nw"] Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.858772 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-index-wln8t"] Mar 11 19:11:05 crc kubenswrapper[4842]: E0311 19:11:05.859173 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ca87ac-6c44-49f1-b128-9593caa6b74c" containerName="manager" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.859190 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ca87ac-6c44-49f1-b128-9593caa6b74c" containerName="manager" Mar 11 19:11:05 crc kubenswrapper[4842]: E0311 19:11:05.859209 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa48e36-bcbd-4033-9cee-fc43aefb1b9a" containerName="operator" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.859216 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa48e36-bcbd-4033-9cee-fc43aefb1b9a" containerName="operator" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.859393 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa48e36-bcbd-4033-9cee-fc43aefb1b9a" containerName="operator" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.859410 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ca87ac-6c44-49f1-b128-9593caa6b74c" containerName="manager" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.859976 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.868784 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-index-wln8t"] Mar 11 19:11:05 crc kubenswrapper[4842]: I0311 19:11:05.910515 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69q9d\" (UniqueName: \"kubernetes.io/projected/e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46-kube-api-access-69q9d\") pod \"nova-operator-index-wln8t\" (UID: \"e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46\") " pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.012170 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69q9d\" (UniqueName: \"kubernetes.io/projected/e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46-kube-api-access-69q9d\") pod \"nova-operator-index-wln8t\" (UID: \"e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46\") " pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.044767 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69q9d\" (UniqueName: \"kubernetes.io/projected/e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46-kube-api-access-69q9d\") pod \"nova-operator-index-wln8t\" (UID: \"e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46\") " pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.210867 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.409498 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/nova-operator-index-ff8nw" podUID="e4afa999-7423-47dd-b966-3d81ee2cb0e3" containerName="registry-server" containerID="cri-o://631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c" gracePeriod=2 Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.499548 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-index-wln8t"] Mar 11 19:11:06 crc kubenswrapper[4842]: W0311 19:11:06.510430 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5f44c0d_a601_4f29_a7eb_dc56c3cf3e46.slice/crio-1993f78fdf077952b0ed317212ac52ae19face9ae4a180a5c3ab27b2fd9c1206 WatchSource:0}: Error finding container 1993f78fdf077952b0ed317212ac52ae19face9ae4a180a5c3ab27b2fd9c1206: Status 404 returned error can't find the container with id 1993f78fdf077952b0ed317212ac52ae19face9ae4a180a5c3ab27b2fd9c1206 Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.800544 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-index-ff8nw" Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.824784 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfc9b\" (UniqueName: \"kubernetes.io/projected/e4afa999-7423-47dd-b966-3d81ee2cb0e3-kube-api-access-zfc9b\") pod \"e4afa999-7423-47dd-b966-3d81ee2cb0e3\" (UID: \"e4afa999-7423-47dd-b966-3d81ee2cb0e3\") " Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.831792 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4afa999-7423-47dd-b966-3d81ee2cb0e3-kube-api-access-zfc9b" (OuterVolumeSpecName: "kube-api-access-zfc9b") pod "e4afa999-7423-47dd-b966-3d81ee2cb0e3" (UID: "e4afa999-7423-47dd-b966-3d81ee2cb0e3"). InnerVolumeSpecName "kube-api-access-zfc9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:11:06 crc kubenswrapper[4842]: I0311 19:11:06.926854 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfc9b\" (UniqueName: \"kubernetes.io/projected/e4afa999-7423-47dd-b966-3d81ee2cb0e3-kube-api-access-zfc9b\") on node \"crc\" DevicePath \"\"" Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.419320 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-index-wln8t" event={"ID":"e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46","Type":"ContainerStarted","Data":"79eb112c59b0e6a04d5f7b37aa7def1c0bc062a8e83c1172d5d300c0ce01bb2f"} Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.419370 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-index-wln8t" event={"ID":"e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46","Type":"ContainerStarted","Data":"1993f78fdf077952b0ed317212ac52ae19face9ae4a180a5c3ab27b2fd9c1206"} Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.421332 4842 generic.go:334] "Generic (PLEG): container finished" podID="e4afa999-7423-47dd-b966-3d81ee2cb0e3" 
containerID="631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c" exitCode=0 Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.421383 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-index-ff8nw" event={"ID":"e4afa999-7423-47dd-b966-3d81ee2cb0e3","Type":"ContainerDied","Data":"631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c"} Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.421436 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-index-ff8nw" Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.421470 4842 scope.go:117] "RemoveContainer" containerID="631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c" Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.421454 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-index-ff8nw" event={"ID":"e4afa999-7423-47dd-b966-3d81ee2cb0e3","Type":"ContainerDied","Data":"9b2cd677c24c76c0ecedf0bfea638383a19120e76a7dff06b0cd84ee0a9e10b7"} Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.459501 4842 scope.go:117] "RemoveContainer" containerID="631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c" Mar 11 19:11:07 crc kubenswrapper[4842]: E0311 19:11:07.460207 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c\": container with ID starting with 631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c not found: ID does not exist" containerID="631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c" Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.460251 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c"} err="failed to get 
container status \"631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c\": rpc error: code = NotFound desc = could not find container \"631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c\": container with ID starting with 631abbc1356442fd674a5bf2373dceb6239d1cd89d250fe6ebea16f413c7169c not found: ID does not exist" Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.465323 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-index-wln8t" podStartSLOduration=2.4135065239999998 podStartE2EDuration="2.465299655s" podCreationTimestamp="2026-03-11 19:11:05 +0000 UTC" firstStartedPulling="2026-03-11 19:11:06.514328561 +0000 UTC m=+1312.162024831" lastFinishedPulling="2026-03-11 19:11:06.566121682 +0000 UTC m=+1312.213817962" observedRunningTime="2026-03-11 19:11:07.459790487 +0000 UTC m=+1313.107486767" watchObservedRunningTime="2026-03-11 19:11:07.465299655 +0000 UTC m=+1313.112995935" Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.492220 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/nova-operator-index-ff8nw"] Mar 11 19:11:07 crc kubenswrapper[4842]: I0311 19:11:07.500724 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/nova-operator-index-ff8nw"] Mar 11 19:11:08 crc kubenswrapper[4842]: I0311 19:11:08.971573 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4afa999-7423-47dd-b966-3d81ee2cb0e3" path="/var/lib/kubelet/pods/e4afa999-7423-47dd-b966-3d81ee2cb0e3/volumes" Mar 11 19:11:16 crc kubenswrapper[4842]: I0311 19:11:16.211590 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:16 crc kubenswrapper[4842]: I0311 19:11:16.212228 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:16 crc kubenswrapper[4842]: I0311 
19:11:16.245855 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:16 crc kubenswrapper[4842]: I0311 19:11:16.530403 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-index-wln8t" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.120283 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"] Mar 11 19:11:22 crc kubenswrapper[4842]: E0311 19:11:22.121231 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4afa999-7423-47dd-b966-3d81ee2cb0e3" containerName="registry-server" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.121252 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4afa999-7423-47dd-b966-3d81ee2cb0e3" containerName="registry-server" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.121507 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4afa999-7423-47dd-b966-3d81ee2cb0e3" containerName="registry-server" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.124187 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.127438 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pg4cf" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.143732 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"] Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.324250 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-bundle\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.324372 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn2bp\" (UniqueName: \"kubernetes.io/projected/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-kube-api-access-wn2bp\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.324442 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-util\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 
19:11:22.426762 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-bundle\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.426900 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn2bp\" (UniqueName: \"kubernetes.io/projected/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-kube-api-access-wn2bp\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.426978 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-util\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.428627 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-util\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.429494 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-bundle\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.461765 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn2bp\" (UniqueName: \"kubernetes.io/projected/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-kube-api-access-wn2bp\") pod \"9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") " pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:22 crc kubenswrapper[4842]: I0311 19:11:22.748917 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:23 crc kubenswrapper[4842]: I0311 19:11:23.228780 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"]
Mar 11 19:11:23 crc kubenswrapper[4842]: I0311 19:11:23.561998 4842 generic.go:334] "Generic (PLEG): container finished" podID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerID="2f9c55fff67e9126495b50d76f72511cbad509b985e7ba2736beadd8ea61178e" exitCode=0
Mar 11 19:11:23 crc kubenswrapper[4842]: I0311 19:11:23.562236 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" event={"ID":"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b","Type":"ContainerDied","Data":"2f9c55fff67e9126495b50d76f72511cbad509b985e7ba2736beadd8ea61178e"}
Mar 11 19:11:23 crc kubenswrapper[4842]: I0311 19:11:23.562665 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" event={"ID":"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b","Type":"ContainerStarted","Data":"25f8f9ca11063481ea8c7c8151434c429fd3b36c65b5f122e9ecb45984ba8e5e"}
Mar 11 19:11:24 crc kubenswrapper[4842]: I0311 19:11:24.576561 4842 generic.go:334] "Generic (PLEG): container finished" podID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerID="83de3cc310215fcf1b385aa6bbc35c1cb36f47ebef0c1f93a5d255e69512229e" exitCode=0
Mar 11 19:11:24 crc kubenswrapper[4842]: I0311 19:11:24.576624 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" event={"ID":"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b","Type":"ContainerDied","Data":"83de3cc310215fcf1b385aa6bbc35c1cb36f47ebef0c1f93a5d255e69512229e"}
Mar 11 19:11:25 crc kubenswrapper[4842]: I0311 19:11:25.591412 4842 generic.go:334] "Generic (PLEG): container finished" podID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerID="43d9df7d5e231db268aa1b8438913cf824e7f0abfab56b66596505d60041f26e" exitCode=0
Mar 11 19:11:25 crc kubenswrapper[4842]: I0311 19:11:25.591471 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" event={"ID":"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b","Type":"ContainerDied","Data":"43d9df7d5e231db268aa1b8438913cf824e7f0abfab56b66596505d60041f26e"}
Mar 11 19:11:26 crc kubenswrapper[4842]: I0311 19:11:26.911252 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.101560 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-bundle\") pod \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") "
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.102062 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn2bp\" (UniqueName: \"kubernetes.io/projected/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-kube-api-access-wn2bp\") pod \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") "
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.102236 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-util\") pod \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\" (UID: \"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b\") "
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.103135 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-bundle" (OuterVolumeSpecName: "bundle") pod "2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" (UID: "2ab92dd8-8fc7-4aa5-b1df-24683fe9360b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.108840 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-kube-api-access-wn2bp" (OuterVolumeSpecName: "kube-api-access-wn2bp") pod "2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" (UID: "2ab92dd8-8fc7-4aa5-b1df-24683fe9360b"). InnerVolumeSpecName "kube-api-access-wn2bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.120806 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-util" (OuterVolumeSpecName: "util") pod "2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" (UID: "2ab92dd8-8fc7-4aa5-b1df-24683fe9360b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.204310 4842 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.204591 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn2bp\" (UniqueName: \"kubernetes.io/projected/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-kube-api-access-wn2bp\") on node \"crc\" DevicePath \"\""
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.204677 4842 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ab92dd8-8fc7-4aa5-b1df-24683fe9360b-util\") on node \"crc\" DevicePath \"\""
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.617735 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq" event={"ID":"2ab92dd8-8fc7-4aa5-b1df-24683fe9360b","Type":"ContainerDied","Data":"25f8f9ca11063481ea8c7c8151434c429fd3b36c65b5f122e9ecb45984ba8e5e"}
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.617782 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f8f9ca11063481ea8c7c8151434c429fd3b36c65b5f122e9ecb45984ba8e5e"
Mar 11 19:11:27 crc kubenswrapper[4842]: I0311 19:11:27.617857 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq"
Mar 11 19:11:31 crc kubenswrapper[4842]: I0311 19:11:31.472074 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:11:31 crc kubenswrapper[4842]: I0311 19:11:31.472566 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.253844 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"]
Mar 11 19:11:39 crc kubenswrapper[4842]: E0311 19:11:39.254756 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerName="util"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.254771 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerName="util"
Mar 11 19:11:39 crc kubenswrapper[4842]: E0311 19:11:39.254787 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerName="extract"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.254794 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerName="extract"
Mar 11 19:11:39 crc kubenswrapper[4842]: E0311 19:11:39.254802 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerName="pull"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.254808 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerName="pull"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.254949 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab92dd8-8fc7-4aa5-b1df-24683fe9360b" containerName="extract"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.255543 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.258490 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-service-cert"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.258798 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-pncd7"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.276676 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"]
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.397572 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4b3af5a-7447-41c9-8cc0-5e927157aecf-webhook-cert\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.397804 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c4b3af5a-7447-41c9-8cc0-5e927157aecf-apiservice-cert\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.397903 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z862p\" (UniqueName: \"kubernetes.io/projected/c4b3af5a-7447-41c9-8cc0-5e927157aecf-kube-api-access-z862p\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.499661 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c4b3af5a-7447-41c9-8cc0-5e927157aecf-apiservice-cert\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.499745 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z862p\" (UniqueName: \"kubernetes.io/projected/c4b3af5a-7447-41c9-8cc0-5e927157aecf-kube-api-access-z862p\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.499796 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4b3af5a-7447-41c9-8cc0-5e927157aecf-webhook-cert\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.508498 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c4b3af5a-7447-41c9-8cc0-5e927157aecf-apiservice-cert\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.510123 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c4b3af5a-7447-41c9-8cc0-5e927157aecf-webhook-cert\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.518405 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z862p\" (UniqueName: \"kubernetes.io/projected/c4b3af5a-7447-41c9-8cc0-5e927157aecf-kube-api-access-z862p\") pod \"nova-operator-controller-manager-6f598d9474-l5k2t\" (UID: \"c4b3af5a-7447-41c9-8cc0-5e927157aecf\") " pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:39 crc kubenswrapper[4842]: I0311 19:11:39.582314 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:40 crc kubenswrapper[4842]: I0311 19:11:40.059952 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"]
Mar 11 19:11:40 crc kubenswrapper[4842]: I0311 19:11:40.740582 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t" event={"ID":"c4b3af5a-7447-41c9-8cc0-5e927157aecf","Type":"ContainerStarted","Data":"a263fd5ec429f9dc218e3dbec11bdcaf0dbf9f7da05cc5ff4ee39cbf757a4a77"}
Mar 11 19:11:40 crc kubenswrapper[4842]: I0311 19:11:40.740953 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t" event={"ID":"c4b3af5a-7447-41c9-8cc0-5e927157aecf","Type":"ContainerStarted","Data":"21044bceed03db7e70d3676852ebdc5be351d7e68f132bd2017af9f45f6e618b"}
Mar 11 19:11:40 crc kubenswrapper[4842]: I0311 19:11:40.741978 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:49 crc kubenswrapper[4842]: I0311 19:11:49.592069 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t"
Mar 11 19:11:49 crc kubenswrapper[4842]: I0311 19:11:49.624450 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-6f598d9474-l5k2t" podStartSLOduration=10.624420056 podStartE2EDuration="10.624420056s" podCreationTimestamp="2026-03-11 19:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:11:40.78079267 +0000 UTC m=+1346.428488970" watchObservedRunningTime="2026-03-11 19:11:49.624420056 +0000 UTC m=+1355.272116346"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.143919 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554272-glgns"]
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.145568 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554272-glgns"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.148908 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.149022 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.149593 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.154556 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554272-glgns"]
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.306930 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tt52\" (UniqueName: \"kubernetes.io/projected/f4b2ac53-6114-4e08-9757-e28296a29695-kube-api-access-9tt52\") pod \"auto-csr-approver-29554272-glgns\" (UID: \"f4b2ac53-6114-4e08-9757-e28296a29695\") " pod="openshift-infra/auto-csr-approver-29554272-glgns"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.408331 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tt52\" (UniqueName: \"kubernetes.io/projected/f4b2ac53-6114-4e08-9757-e28296a29695-kube-api-access-9tt52\") pod \"auto-csr-approver-29554272-glgns\" (UID: \"f4b2ac53-6114-4e08-9757-e28296a29695\") " pod="openshift-infra/auto-csr-approver-29554272-glgns"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.426963 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tt52\" (UniqueName: \"kubernetes.io/projected/f4b2ac53-6114-4e08-9757-e28296a29695-kube-api-access-9tt52\") pod \"auto-csr-approver-29554272-glgns\" (UID: \"f4b2ac53-6114-4e08-9757-e28296a29695\") " pod="openshift-infra/auto-csr-approver-29554272-glgns"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.465677 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554272-glgns"
Mar 11 19:12:00 crc kubenswrapper[4842]: I0311 19:12:00.939468 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554272-glgns"]
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.472281 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.472617 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.472675 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs"
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.473423 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e0132978f744075878dba0b5cc46ab6911da7e9e6a8e99f3a4db40255e33bd4"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.473481 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://0e0132978f744075878dba0b5cc46ab6911da7e9e6a8e99f3a4db40255e33bd4" gracePeriod=600
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.921078 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554272-glgns" event={"ID":"f4b2ac53-6114-4e08-9757-e28296a29695","Type":"ContainerStarted","Data":"6b34838cc97d89a24f1654816e0e449bd33eb60530eb6dd715c74c45827a2b1d"}
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.925135 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="0e0132978f744075878dba0b5cc46ab6911da7e9e6a8e99f3a4db40255e33bd4" exitCode=0
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.925188 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"0e0132978f744075878dba0b5cc46ab6911da7e9e6a8e99f3a4db40255e33bd4"}
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.925225 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"}
Mar 11 19:12:01 crc kubenswrapper[4842]: I0311 19:12:01.925295 4842 scope.go:117] "RemoveContainer" containerID="d0ede0d62e1ca5af886d8ad032f52cc79f17aa7c91031e5d4935ed627d33421d"
Mar 11 19:12:02 crc kubenswrapper[4842]: I0311 19:12:02.934483 4842 generic.go:334] "Generic (PLEG): container finished" podID="f4b2ac53-6114-4e08-9757-e28296a29695" containerID="edd7b95b78f42abd379e320016674b340cbca802d22798680f08d41a376a0564" exitCode=0
Mar 11 19:12:02 crc kubenswrapper[4842]: I0311 19:12:02.934555 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554272-glgns" event={"ID":"f4b2ac53-6114-4e08-9757-e28296a29695","Type":"ContainerDied","Data":"edd7b95b78f42abd379e320016674b340cbca802d22798680f08d41a376a0564"}
Mar 11 19:12:04 crc kubenswrapper[4842]: I0311 19:12:04.280824 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554272-glgns"
Mar 11 19:12:04 crc kubenswrapper[4842]: I0311 19:12:04.383835 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tt52\" (UniqueName: \"kubernetes.io/projected/f4b2ac53-6114-4e08-9757-e28296a29695-kube-api-access-9tt52\") pod \"f4b2ac53-6114-4e08-9757-e28296a29695\" (UID: \"f4b2ac53-6114-4e08-9757-e28296a29695\") "
Mar 11 19:12:04 crc kubenswrapper[4842]: I0311 19:12:04.390422 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b2ac53-6114-4e08-9757-e28296a29695-kube-api-access-9tt52" (OuterVolumeSpecName: "kube-api-access-9tt52") pod "f4b2ac53-6114-4e08-9757-e28296a29695" (UID: "f4b2ac53-6114-4e08-9757-e28296a29695"). InnerVolumeSpecName "kube-api-access-9tt52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:12:04 crc kubenswrapper[4842]: I0311 19:12:04.486305 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tt52\" (UniqueName: \"kubernetes.io/projected/f4b2ac53-6114-4e08-9757-e28296a29695-kube-api-access-9tt52\") on node \"crc\" DevicePath \"\""
Mar 11 19:12:04 crc kubenswrapper[4842]: I0311 19:12:04.959295 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554272-glgns" event={"ID":"f4b2ac53-6114-4e08-9757-e28296a29695","Type":"ContainerDied","Data":"6b34838cc97d89a24f1654816e0e449bd33eb60530eb6dd715c74c45827a2b1d"}
Mar 11 19:12:04 crc kubenswrapper[4842]: I0311 19:12:04.959335 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b34838cc97d89a24f1654816e0e449bd33eb60530eb6dd715c74c45827a2b1d"
Mar 11 19:12:04 crc kubenswrapper[4842]: I0311 19:12:04.959378 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554272-glgns"
Mar 11 19:12:05 crc kubenswrapper[4842]: I0311 19:12:05.340355 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554266-x7vxv"]
Mar 11 19:12:05 crc kubenswrapper[4842]: I0311 19:12:05.348547 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554266-x7vxv"]
Mar 11 19:12:06 crc kubenswrapper[4842]: I0311 19:12:06.976921 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07" path="/var/lib/kubelet/pods/f9b6ad7f-8d2c-4197-a3b8-71a89c3a9d07/volumes"
Mar 11 19:12:16 crc kubenswrapper[4842]: I0311 19:12:16.974143 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-db-create-p2t4f"]
Mar 11 19:12:16 crc kubenswrapper[4842]: E0311 19:12:16.974867 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b2ac53-6114-4e08-9757-e28296a29695" containerName="oc"
Mar 11 19:12:16 crc kubenswrapper[4842]: I0311 19:12:16.974882 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b2ac53-6114-4e08-9757-e28296a29695" containerName="oc"
Mar 11 19:12:16 crc kubenswrapper[4842]: I0311 19:12:16.975060 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b2ac53-6114-4e08-9757-e28296a29695" containerName="oc"
Mar 11 19:12:16 crc kubenswrapper[4842]: I0311 19:12:16.975632 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-p2t4f"
Mar 11 19:12:16 crc kubenswrapper[4842]: I0311 19:12:16.996553 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-p2t4f"]
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.052501 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-m8r7v"]
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.053560 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-m8r7v"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.064511 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8j7c\" (UniqueName: \"kubernetes.io/projected/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-kube-api-access-r8j7c\") pod \"nova-api-db-create-p2t4f\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " pod="nova-kuttl-default/nova-api-db-create-p2t4f"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.064556 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb473bb-9bf7-4bab-91e8-eef4e6931322-operator-scripts\") pod \"nova-cell0-db-create-m8r7v\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " pod="nova-kuttl-default/nova-cell0-db-create-m8r7v"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.064615 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-operator-scripts\") pod \"nova-api-db-create-p2t4f\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " pod="nova-kuttl-default/nova-api-db-create-p2t4f"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.064632 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jttw8\" (UniqueName: \"kubernetes.io/projected/efb473bb-9bf7-4bab-91e8-eef4e6931322-kube-api-access-jttw8\") pod \"nova-cell0-db-create-m8r7v\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " pod="nova-kuttl-default/nova-cell0-db-create-m8r7v"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.066350 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-m8r7v"]
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.154555 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-3968-account-create-update-v22xt"]
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.155537 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.159180 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-api-db-secret"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.166068 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-operator-scripts\") pod \"nova-api-db-create-p2t4f\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " pod="nova-kuttl-default/nova-api-db-create-p2t4f"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.166120 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jttw8\" (UniqueName: \"kubernetes.io/projected/efb473bb-9bf7-4bab-91e8-eef4e6931322-kube-api-access-jttw8\") pod \"nova-cell0-db-create-m8r7v\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " pod="nova-kuttl-default/nova-cell0-db-create-m8r7v"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.166199 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-operator-scripts\") pod \"nova-api-3968-account-create-update-v22xt\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.166259 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8j7c\" (UniqueName: \"kubernetes.io/projected/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-kube-api-access-r8j7c\") pod \"nova-api-db-create-p2t4f\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " pod="nova-kuttl-default/nova-api-db-create-p2t4f"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.166300 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb473bb-9bf7-4bab-91e8-eef4e6931322-operator-scripts\") pod \"nova-cell0-db-create-m8r7v\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " pod="nova-kuttl-default/nova-cell0-db-create-m8r7v"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.166322 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hsv6\" (UniqueName: \"kubernetes.io/projected/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-kube-api-access-7hsv6\") pod \"nova-api-3968-account-create-update-v22xt\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.167069 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-operator-scripts\") pod \"nova-api-db-create-p2t4f\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " pod="nova-kuttl-default/nova-api-db-create-p2t4f"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.167081 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb473bb-9bf7-4bab-91e8-eef4e6931322-operator-scripts\") pod \"nova-cell0-db-create-m8r7v\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " pod="nova-kuttl-default/nova-cell0-db-create-m8r7v"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.169976 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-3968-account-create-update-v22xt"]
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.187855 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jttw8\" (UniqueName: \"kubernetes.io/projected/efb473bb-9bf7-4bab-91e8-eef4e6931322-kube-api-access-jttw8\") pod \"nova-cell0-db-create-m8r7v\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " pod="nova-kuttl-default/nova-cell0-db-create-m8r7v"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.202563 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8j7c\" (UniqueName: \"kubernetes.io/projected/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-kube-api-access-r8j7c\") pod \"nova-api-db-create-p2t4f\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " pod="nova-kuttl-default/nova-api-db-create-p2t4f"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.252451 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-kx6kb"]
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.253860 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-kx6kb"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.261422 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-kx6kb"]
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.267570 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-operator-scripts\") pod \"nova-cell1-db-create-kx6kb\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " pod="nova-kuttl-default/nova-cell1-db-create-kx6kb"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.267704 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-operator-scripts\") pod \"nova-api-3968-account-create-update-v22xt\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.267874 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hsv6\" (UniqueName: \"kubernetes.io/projected/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-kube-api-access-7hsv6\") pod \"nova-api-3968-account-create-update-v22xt\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt"
Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.267991 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22v4\" (UniqueName: \"kubernetes.io/projected/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-kube-api-access-s22v4\") pod \"nova-cell1-db-create-kx6kb\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " pod="nova-kuttl-default/nova-cell1-db-create-kx6kb"
Mar 11 19:12:17 crc kubenswrapper[4842]:
I0311 19:12:17.269101 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-operator-scripts\") pod \"nova-api-3968-account-create-update-v22xt\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.286656 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hsv6\" (UniqueName: \"kubernetes.io/projected/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-kube-api-access-7hsv6\") pod \"nova-api-3968-account-create-update-v22xt\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.294785 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-p2t4f" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.359710 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl"] Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.361097 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.363726 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell0-db-secret" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.369608 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-m8r7v" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.369712 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbb76\" (UniqueName: \"kubernetes.io/projected/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-kube-api-access-qbb76\") pod \"nova-cell0-a055-account-create-update-g92kl\" (UID: \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.369800 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s22v4\" (UniqueName: \"kubernetes.io/projected/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-kube-api-access-s22v4\") pod \"nova-cell1-db-create-kx6kb\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.370001 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-operator-scripts\") pod \"nova-cell1-db-create-kx6kb\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.370040 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-operator-scripts\") pod \"nova-cell0-a055-account-create-update-g92kl\" (UID: \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.370977 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-operator-scripts\") pod \"nova-cell1-db-create-kx6kb\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.375922 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl"] Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.397488 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22v4\" (UniqueName: \"kubernetes.io/projected/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-kube-api-access-s22v4\") pod \"nova-cell1-db-create-kx6kb\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.470848 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbb76\" (UniqueName: \"kubernetes.io/projected/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-kube-api-access-qbb76\") pod \"nova-cell0-a055-account-create-update-g92kl\" (UID: \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.471016 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-operator-scripts\") pod \"nova-cell0-a055-account-create-update-g92kl\" (UID: \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.472170 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-operator-scripts\") pod \"nova-cell0-a055-account-create-update-g92kl\" (UID: 
\"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.474724 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.493531 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbb76\" (UniqueName: \"kubernetes.io/projected/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-kube-api-access-qbb76\") pod \"nova-cell0-a055-account-create-update-g92kl\" (UID: \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.562003 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p"] Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.563549 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.565765 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell1-db-secret" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.571063 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.571974 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af8fbb24-c60f-4143-8811-f56af287ace2-operator-scripts\") pod \"nova-cell1-b521-account-create-update-8tl7p\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.572046 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrz96\" (UniqueName: \"kubernetes.io/projected/af8fbb24-c60f-4143-8811-f56af287ace2-kube-api-access-wrz96\") pod \"nova-cell1-b521-account-create-update-8tl7p\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.599154 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p"] Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.673183 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrz96\" (UniqueName: \"kubernetes.io/projected/af8fbb24-c60f-4143-8811-f56af287ace2-kube-api-access-wrz96\") pod \"nova-cell1-b521-account-create-update-8tl7p\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.673301 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af8fbb24-c60f-4143-8811-f56af287ace2-operator-scripts\") pod \"nova-cell1-b521-account-create-update-8tl7p\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " 
pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.673983 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af8fbb24-c60f-4143-8811-f56af287ace2-operator-scripts\") pod \"nova-cell1-b521-account-create-update-8tl7p\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.693210 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrz96\" (UniqueName: \"kubernetes.io/projected/af8fbb24-c60f-4143-8811-f56af287ace2-kube-api-access-wrz96\") pod \"nova-cell1-b521-account-create-update-8tl7p\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.704428 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:17 crc kubenswrapper[4842]: W0311 19:12:17.861626 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda0cc46d_a148_42b6_a184_8cbd5e5c14e4.slice/crio-2aa1aa543e5794ae508d6c66891b9c5dbf173f8990864a1fa68cca78c4fd5dc4 WatchSource:0}: Error finding container 2aa1aa543e5794ae508d6c66891b9c5dbf173f8990864a1fa68cca78c4fd5dc4: Status 404 returned error can't find the container with id 2aa1aa543e5794ae508d6c66891b9c5dbf173f8990864a1fa68cca78c4fd5dc4 Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.862188 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-p2t4f"] Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.884020 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:17 crc kubenswrapper[4842]: I0311 19:12:17.993562 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-m8r7v"] Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.044830 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl"] Mar 11 19:12:18 crc kubenswrapper[4842]: W0311 19:12:18.075131 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1ee4193_bc4e_4684_8ce1_92c4db5864f2.slice/crio-12e1f61df47a346a48f3ed32d0ae57bc5106703490aa8d9c97a6bb3c5c6a7b7c WatchSource:0}: Error finding container 12e1f61df47a346a48f3ed32d0ae57bc5106703490aa8d9c97a6bb3c5c6a7b7c: Status 404 returned error can't find the container with id 12e1f61df47a346a48f3ed32d0ae57bc5106703490aa8d9c97a6bb3c5c6a7b7c Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.077867 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-3968-account-create-update-v22xt"] Mar 11 19:12:18 crc kubenswrapper[4842]: W0311 19:12:18.088502 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda909f9f5_c1ce_437f_a60c_3f5e73fd5f40.slice/crio-11c8bd5d5fdcea26fa0f2e33b83931ee909da106ebaccc8cc57f6bb1b62265b0 WatchSource:0}: Error finding container 11c8bd5d5fdcea26fa0f2e33b83931ee909da106ebaccc8cc57f6bb1b62265b0: Status 404 returned error can't find the container with id 11c8bd5d5fdcea26fa0f2e33b83931ee909da106ebaccc8cc57f6bb1b62265b0 Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.092834 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-p2t4f" 
event={"ID":"da0cc46d-a148-42b6-a184-8cbd5e5c14e4","Type":"ContainerStarted","Data":"a63c3238bfac46b9c23626d6ade9d53415f70c92a5e0fee91287df2d9fc57637"} Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.092884 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-p2t4f" event={"ID":"da0cc46d-a148-42b6-a184-8cbd5e5c14e4","Type":"ContainerStarted","Data":"2aa1aa543e5794ae508d6c66891b9c5dbf173f8990864a1fa68cca78c4fd5dc4"} Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.097734 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-m8r7v" event={"ID":"efb473bb-9bf7-4bab-91e8-eef4e6931322","Type":"ContainerStarted","Data":"0f5443096927e8b65517b136ab3aeac820c82ea14cb8c48d1f4cb2862eef7918"} Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.108815 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-api-db-create-p2t4f" podStartSLOduration=2.108802062 podStartE2EDuration="2.108802062s" podCreationTimestamp="2026-03-11 19:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:12:18.106798208 +0000 UTC m=+1383.754494478" watchObservedRunningTime="2026-03-11 19:12:18.108802062 +0000 UTC m=+1383.756498332" Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.161520 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-kx6kb"] Mar 11 19:12:18 crc kubenswrapper[4842]: I0311 19:12:18.356170 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p"] Mar 11 19:12:18 crc kubenswrapper[4842]: W0311 19:12:18.374130 4842 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf8fbb24_c60f_4143_8811_f56af287ace2.slice/crio-35a83d3a94d47ca09e7c78c9d5a664499e15b9f406eaeea9aa4842a7fe079779 WatchSource:0}: Error finding container 35a83d3a94d47ca09e7c78c9d5a664499e15b9f406eaeea9aa4842a7fe079779: Status 404 returned error can't find the container with id 35a83d3a94d47ca09e7c78c9d5a664499e15b9f406eaeea9aa4842a7fe079779 Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.109343 4842 generic.go:334] "Generic (PLEG): container finished" podID="6a073ef4-9c1e-481a-aa9a-405e4892e3ef" containerID="53b16f0eb12ca94a03d896c9d4d12003a1b24c2fe96715ffa4b8e7263c1ec4e1" exitCode=0 Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.109455 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" event={"ID":"6a073ef4-9c1e-481a-aa9a-405e4892e3ef","Type":"ContainerDied","Data":"53b16f0eb12ca94a03d896c9d4d12003a1b24c2fe96715ffa4b8e7263c1ec4e1"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.109778 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" event={"ID":"6a073ef4-9c1e-481a-aa9a-405e4892e3ef","Type":"ContainerStarted","Data":"cd6bd4c54b09bc3d238b9bb08e9522a3f658ef071bb4e07c1237f09c7f3727f7"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.110935 4842 generic.go:334] "Generic (PLEG): container finished" podID="efb473bb-9bf7-4bab-91e8-eef4e6931322" containerID="657aec834e4d5c4a396ecde7752aeb66b8f6a57912102f54ece9a6d9b736589a" exitCode=0 Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.110996 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-m8r7v" event={"ID":"efb473bb-9bf7-4bab-91e8-eef4e6931322","Type":"ContainerDied","Data":"657aec834e4d5c4a396ecde7752aeb66b8f6a57912102f54ece9a6d9b736589a"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.113045 4842 generic.go:334] "Generic (PLEG): container 
finished" podID="a909f9f5-c1ce-437f-a60c-3f5e73fd5f40" containerID="b48db79b04ca4e83ed26998ec2e6a74da2495699c80c59933de44a3b7f18ab67" exitCode=0 Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.113199 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" event={"ID":"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40","Type":"ContainerDied","Data":"b48db79b04ca4e83ed26998ec2e6a74da2495699c80c59933de44a3b7f18ab67"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.113243 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" event={"ID":"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40","Type":"ContainerStarted","Data":"11c8bd5d5fdcea26fa0f2e33b83931ee909da106ebaccc8cc57f6bb1b62265b0"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.115692 4842 generic.go:334] "Generic (PLEG): container finished" podID="da0cc46d-a148-42b6-a184-8cbd5e5c14e4" containerID="a63c3238bfac46b9c23626d6ade9d53415f70c92a5e0fee91287df2d9fc57637" exitCode=0 Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.115737 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-p2t4f" event={"ID":"da0cc46d-a148-42b6-a184-8cbd5e5c14e4","Type":"ContainerDied","Data":"a63c3238bfac46b9c23626d6ade9d53415f70c92a5e0fee91287df2d9fc57637"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.118370 4842 generic.go:334] "Generic (PLEG): container finished" podID="af8fbb24-c60f-4143-8811-f56af287ace2" containerID="5b7f363886c60e12a093cfa74d7e5d9a0d20de1fab9acfb0f8360ffd24fe569a" exitCode=0 Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.118487 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" event={"ID":"af8fbb24-c60f-4143-8811-f56af287ace2","Type":"ContainerDied","Data":"5b7f363886c60e12a093cfa74d7e5d9a0d20de1fab9acfb0f8360ffd24fe569a"} Mar 11 
19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.118533 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" event={"ID":"af8fbb24-c60f-4143-8811-f56af287ace2","Type":"ContainerStarted","Data":"35a83d3a94d47ca09e7c78c9d5a664499e15b9f406eaeea9aa4842a7fe079779"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.123324 4842 generic.go:334] "Generic (PLEG): container finished" podID="c1ee4193-bc4e-4684-8ce1-92c4db5864f2" containerID="4e686365a3ab688a6aab2ab926145acecb1fa50309580ebea4d78a1a4eb0d505" exitCode=0 Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.123387 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" event={"ID":"c1ee4193-bc4e-4684-8ce1-92c4db5864f2","Type":"ContainerDied","Data":"4e686365a3ab688a6aab2ab926145acecb1fa50309580ebea4d78a1a4eb0d505"} Mar 11 19:12:19 crc kubenswrapper[4842]: I0311 19:12:19.123415 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" event={"ID":"c1ee4193-bc4e-4684-8ce1-92c4db5864f2","Type":"ContainerStarted","Data":"12e1f61df47a346a48f3ed32d0ae57bc5106703490aa8d9c97a6bb3c5c6a7b7c"} Mar 11 19:12:20 crc kubenswrapper[4842]: I0311 19:12:20.685459 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:20 crc kubenswrapper[4842]: I0311 19:12:20.843220 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrz96\" (UniqueName: \"kubernetes.io/projected/af8fbb24-c60f-4143-8811-f56af287ace2-kube-api-access-wrz96\") pod \"af8fbb24-c60f-4143-8811-f56af287ace2\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " Mar 11 19:12:20 crc kubenswrapper[4842]: I0311 19:12:20.843420 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af8fbb24-c60f-4143-8811-f56af287ace2-operator-scripts\") pod \"af8fbb24-c60f-4143-8811-f56af287ace2\" (UID: \"af8fbb24-c60f-4143-8811-f56af287ace2\") " Mar 11 19:12:20 crc kubenswrapper[4842]: I0311 19:12:20.844556 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af8fbb24-c60f-4143-8811-f56af287ace2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af8fbb24-c60f-4143-8811-f56af287ace2" (UID: "af8fbb24-c60f-4143-8811-f56af287ace2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:12:20 crc kubenswrapper[4842]: I0311 19:12:20.860708 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8fbb24-c60f-4143-8811-f56af287ace2-kube-api-access-wrz96" (OuterVolumeSpecName: "kube-api-access-wrz96") pod "af8fbb24-c60f-4143-8811-f56af287ace2" (UID: "af8fbb24-c60f-4143-8811-f56af287ace2"). InnerVolumeSpecName "kube-api-access-wrz96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:12:20 crc kubenswrapper[4842]: I0311 19:12:20.946242 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af8fbb24-c60f-4143-8811-f56af287ace2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:20 crc kubenswrapper[4842]: I0311 19:12:20.946835 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrz96\" (UniqueName: \"kubernetes.io/projected/af8fbb24-c60f-4143-8811-f56af287ace2-kube-api-access-wrz96\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.035207 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.040014 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-p2t4f" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.047119 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.071314 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.075319 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-m8r7v" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.152519 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s22v4\" (UniqueName: \"kubernetes.io/projected/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-kube-api-access-s22v4\") pod \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.152590 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-operator-scripts\") pod \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\" (UID: \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.152668 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-operator-scripts\") pod \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.152728 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-operator-scripts\") pod \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\" (UID: \"6a073ef4-9c1e-481a-aa9a-405e4892e3ef\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.152744 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbb76\" (UniqueName: \"kubernetes.io/projected/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-kube-api-access-qbb76\") pod \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\" (UID: \"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.152783 4842 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-r8j7c\" (UniqueName: \"kubernetes.io/projected/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-kube-api-access-r8j7c\") pod \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\" (UID: \"da0cc46d-a148-42b6-a184-8cbd5e5c14e4\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.154707 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" event={"ID":"af8fbb24-c60f-4143-8811-f56af287ace2","Type":"ContainerDied","Data":"35a83d3a94d47ca09e7c78c9d5a664499e15b9f406eaeea9aa4842a7fe079779"} Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.155774 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a83d3a94d47ca09e7c78c9d5a664499e15b9f406eaeea9aa4842a7fe079779" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.155898 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.156773 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a909f9f5-c1ce-437f-a60c-3f5e73fd5f40" (UID: "a909f9f5-c1ce-437f-a60c-3f5e73fd5f40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.157349 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da0cc46d-a148-42b6-a184-8cbd5e5c14e4" (UID: "da0cc46d-a148-42b6-a184-8cbd5e5c14e4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.158715 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a073ef4-9c1e-481a-aa9a-405e4892e3ef" (UID: "6a073ef4-9c1e-481a-aa9a-405e4892e3ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.161875 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-kube-api-access-qbb76" (OuterVolumeSpecName: "kube-api-access-qbb76") pod "a909f9f5-c1ce-437f-a60c-3f5e73fd5f40" (UID: "a909f9f5-c1ce-437f-a60c-3f5e73fd5f40"). InnerVolumeSpecName "kube-api-access-qbb76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.163157 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" event={"ID":"c1ee4193-bc4e-4684-8ce1-92c4db5864f2","Type":"ContainerDied","Data":"12e1f61df47a346a48f3ed32d0ae57bc5106703490aa8d9c97a6bb3c5c6a7b7c"} Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.163230 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e1f61df47a346a48f3ed32d0ae57bc5106703490aa8d9c97a6bb3c5c6a7b7c" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.163756 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-kube-api-access-r8j7c" (OuterVolumeSpecName: "kube-api-access-r8j7c") pod "da0cc46d-a148-42b6-a184-8cbd5e5c14e4" (UID: "da0cc46d-a148-42b6-a184-8cbd5e5c14e4"). InnerVolumeSpecName "kube-api-access-r8j7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.163847 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-kube-api-access-s22v4" (OuterVolumeSpecName: "kube-api-access-s22v4") pod "6a073ef4-9c1e-481a-aa9a-405e4892e3ef" (UID: "6a073ef4-9c1e-481a-aa9a-405e4892e3ef"). InnerVolumeSpecName "kube-api-access-s22v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.164326 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-3968-account-create-update-v22xt" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.171554 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-m8r7v" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.171764 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-m8r7v" event={"ID":"efb473bb-9bf7-4bab-91e8-eef4e6931322","Type":"ContainerDied","Data":"0f5443096927e8b65517b136ab3aeac820c82ea14cb8c48d1f4cb2862eef7918"} Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.171803 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f5443096927e8b65517b136ab3aeac820c82ea14cb8c48d1f4cb2862eef7918" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.177246 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.177551 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-kx6kb" event={"ID":"6a073ef4-9c1e-481a-aa9a-405e4892e3ef","Type":"ContainerDied","Data":"cd6bd4c54b09bc3d238b9bb08e9522a3f658ef071bb4e07c1237f09c7f3727f7"} Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.177627 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd6bd4c54b09bc3d238b9bb08e9522a3f658ef071bb4e07c1237f09c7f3727f7" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.181944 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.182015 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl" event={"ID":"a909f9f5-c1ce-437f-a60c-3f5e73fd5f40","Type":"ContainerDied","Data":"11c8bd5d5fdcea26fa0f2e33b83931ee909da106ebaccc8cc57f6bb1b62265b0"} Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.182265 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11c8bd5d5fdcea26fa0f2e33b83931ee909da106ebaccc8cc57f6bb1b62265b0" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.187206 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-p2t4f" event={"ID":"da0cc46d-a148-42b6-a184-8cbd5e5c14e4","Type":"ContainerDied","Data":"2aa1aa543e5794ae508d6c66891b9c5dbf173f8990864a1fa68cca78c4fd5dc4"} Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.187234 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa1aa543e5794ae508d6c66891b9c5dbf173f8990864a1fa68cca78c4fd5dc4" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.187417 4842 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-p2t4f" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.254928 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-operator-scripts\") pod \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.254986 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb473bb-9bf7-4bab-91e8-eef4e6931322-operator-scripts\") pod \"efb473bb-9bf7-4bab-91e8-eef4e6931322\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255006 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jttw8\" (UniqueName: \"kubernetes.io/projected/efb473bb-9bf7-4bab-91e8-eef4e6931322-kube-api-access-jttw8\") pod \"efb473bb-9bf7-4bab-91e8-eef4e6931322\" (UID: \"efb473bb-9bf7-4bab-91e8-eef4e6931322\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255073 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hsv6\" (UniqueName: \"kubernetes.io/projected/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-kube-api-access-7hsv6\") pod \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\" (UID: \"c1ee4193-bc4e-4684-8ce1-92c4db5864f2\") " Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255438 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255455 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255465 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbb76\" (UniqueName: \"kubernetes.io/projected/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40-kube-api-access-qbb76\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255477 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255486 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8j7c\" (UniqueName: \"kubernetes.io/projected/da0cc46d-a148-42b6-a184-8cbd5e5c14e4-kube-api-access-r8j7c\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255495 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s22v4\" (UniqueName: \"kubernetes.io/projected/6a073ef4-9c1e-481a-aa9a-405e4892e3ef-kube-api-access-s22v4\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.255654 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1ee4193-bc4e-4684-8ce1-92c4db5864f2" (UID: "c1ee4193-bc4e-4684-8ce1-92c4db5864f2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.256001 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb473bb-9bf7-4bab-91e8-eef4e6931322-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efb473bb-9bf7-4bab-91e8-eef4e6931322" (UID: "efb473bb-9bf7-4bab-91e8-eef4e6931322"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.259337 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-kube-api-access-7hsv6" (OuterVolumeSpecName: "kube-api-access-7hsv6") pod "c1ee4193-bc4e-4684-8ce1-92c4db5864f2" (UID: "c1ee4193-bc4e-4684-8ce1-92c4db5864f2"). InnerVolumeSpecName "kube-api-access-7hsv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.260097 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb473bb-9bf7-4bab-91e8-eef4e6931322-kube-api-access-jttw8" (OuterVolumeSpecName: "kube-api-access-jttw8") pod "efb473bb-9bf7-4bab-91e8-eef4e6931322" (UID: "efb473bb-9bf7-4bab-91e8-eef4e6931322"). InnerVolumeSpecName "kube-api-access-jttw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.357423 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.357458 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb473bb-9bf7-4bab-91e8-eef4e6931322-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.357468 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jttw8\" (UniqueName: \"kubernetes.io/projected/efb473bb-9bf7-4bab-91e8-eef4e6931322-kube-api-access-jttw8\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:21 crc kubenswrapper[4842]: I0311 19:12:21.357480 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hsv6\" (UniqueName: \"kubernetes.io/projected/c1ee4193-bc4e-4684-8ce1-92c4db5864f2-kube-api-access-7hsv6\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.189648 4842 scope.go:117] "RemoveContainer" containerID="2bbe9e150869057fecfd22614ea7ace399559e54340b1bcac4f70859b35233a2" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.356565 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf"] Mar 11 19:12:27 crc kubenswrapper[4842]: E0311 19:12:27.356943 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ee4193-bc4e-4684-8ce1-92c4db5864f2" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.356965 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ee4193-bc4e-4684-8ce1-92c4db5864f2" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: E0311 
19:12:27.356982 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a909f9f5-c1ce-437f-a60c-3f5e73fd5f40" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.356992 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a909f9f5-c1ce-437f-a60c-3f5e73fd5f40" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: E0311 19:12:27.357008 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb473bb-9bf7-4bab-91e8-eef4e6931322" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357017 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb473bb-9bf7-4bab-91e8-eef4e6931322" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: E0311 19:12:27.357036 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a073ef4-9c1e-481a-aa9a-405e4892e3ef" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357046 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a073ef4-9c1e-481a-aa9a-405e4892e3ef" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: E0311 19:12:27.357066 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0cc46d-a148-42b6-a184-8cbd5e5c14e4" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357076 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0cc46d-a148-42b6-a184-8cbd5e5c14e4" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: E0311 19:12:27.357092 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8fbb24-c60f-4143-8811-f56af287ace2" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357101 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8fbb24-c60f-4143-8811-f56af287ace2" 
containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357310 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb473bb-9bf7-4bab-91e8-eef4e6931322" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357326 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a073ef4-9c1e-481a-aa9a-405e4892e3ef" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357342 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8fbb24-c60f-4143-8811-f56af287ace2" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357360 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ee4193-bc4e-4684-8ce1-92c4db5864f2" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357371 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a909f9f5-c1ce-437f-a60c-3f5e73fd5f40" containerName="mariadb-account-create-update" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357381 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="da0cc46d-a148-42b6-a184-8cbd5e5c14e4" containerName="mariadb-database-create" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.357993 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.366948 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf"] Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.368572 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.372817 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-kvrtt" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.373127 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-scripts" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.468567 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.468619 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.468656 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlw5r\" (UniqueName: \"kubernetes.io/projected/1135b7fc-e609-4d5c-8301-2606cf886c49-kube-api-access-hlw5r\") pod 
\"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.570091 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.570139 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.570172 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlw5r\" (UniqueName: \"kubernetes.io/projected/1135b7fc-e609-4d5c-8301-2606cf886c49-kube-api-access-hlw5r\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.576501 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.576569 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.587390 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlw5r\" (UniqueName: \"kubernetes.io/projected/1135b7fc-e609-4d5c-8301-2606cf886c49-kube-api-access-hlw5r\") pod \"nova-kuttl-cell0-conductor-db-sync-76jzf\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:27 crc kubenswrapper[4842]: I0311 19:12:27.677992 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:28 crc kubenswrapper[4842]: I0311 19:12:28.169689 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf"] Mar 11 19:12:28 crc kubenswrapper[4842]: I0311 19:12:28.287752 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" event={"ID":"1135b7fc-e609-4d5c-8301-2606cf886c49","Type":"ContainerStarted","Data":"035506c7c920131e06a18aeacd7a1e8bbb43a874669ace5b384febb778e25ac6"} Mar 11 19:12:36 crc kubenswrapper[4842]: I0311 19:12:36.378598 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" event={"ID":"1135b7fc-e609-4d5c-8301-2606cf886c49","Type":"ContainerStarted","Data":"31bd540836d1ecfb461194fee32946cce88c1caa154bb283417af56ab14b0b8e"} Mar 11 19:12:36 crc kubenswrapper[4842]: I0311 19:12:36.400261 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" podStartSLOduration=1.6650994190000001 
podStartE2EDuration="9.400229291s" podCreationTimestamp="2026-03-11 19:12:27 +0000 UTC" firstStartedPulling="2026-03-11 19:12:28.174473279 +0000 UTC m=+1393.822169559" lastFinishedPulling="2026-03-11 19:12:35.909603151 +0000 UTC m=+1401.557299431" observedRunningTime="2026-03-11 19:12:36.395879234 +0000 UTC m=+1402.043575534" watchObservedRunningTime="2026-03-11 19:12:36.400229291 +0000 UTC m=+1402.047925571" Mar 11 19:12:46 crc kubenswrapper[4842]: I0311 19:12:46.493303 4842 generic.go:334] "Generic (PLEG): container finished" podID="1135b7fc-e609-4d5c-8301-2606cf886c49" containerID="31bd540836d1ecfb461194fee32946cce88c1caa154bb283417af56ab14b0b8e" exitCode=0 Mar 11 19:12:46 crc kubenswrapper[4842]: I0311 19:12:46.493437 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" event={"ID":"1135b7fc-e609-4d5c-8301-2606cf886c49","Type":"ContainerDied","Data":"31bd540836d1ecfb461194fee32946cce88c1caa154bb283417af56ab14b0b8e"} Mar 11 19:12:47 crc kubenswrapper[4842]: I0311 19:12:47.797576 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:47 crc kubenswrapper[4842]: I0311 19:12:47.953610 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlw5r\" (UniqueName: \"kubernetes.io/projected/1135b7fc-e609-4d5c-8301-2606cf886c49-kube-api-access-hlw5r\") pod \"1135b7fc-e609-4d5c-8301-2606cf886c49\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " Mar 11 19:12:47 crc kubenswrapper[4842]: I0311 19:12:47.953678 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-scripts\") pod \"1135b7fc-e609-4d5c-8301-2606cf886c49\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " Mar 11 19:12:47 crc kubenswrapper[4842]: I0311 19:12:47.953715 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-config-data\") pod \"1135b7fc-e609-4d5c-8301-2606cf886c49\" (UID: \"1135b7fc-e609-4d5c-8301-2606cf886c49\") " Mar 11 19:12:47 crc kubenswrapper[4842]: I0311 19:12:47.961918 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-scripts" (OuterVolumeSpecName: "scripts") pod "1135b7fc-e609-4d5c-8301-2606cf886c49" (UID: "1135b7fc-e609-4d5c-8301-2606cf886c49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:12:47 crc kubenswrapper[4842]: I0311 19:12:47.965457 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1135b7fc-e609-4d5c-8301-2606cf886c49-kube-api-access-hlw5r" (OuterVolumeSpecName: "kube-api-access-hlw5r") pod "1135b7fc-e609-4d5c-8301-2606cf886c49" (UID: "1135b7fc-e609-4d5c-8301-2606cf886c49"). InnerVolumeSpecName "kube-api-access-hlw5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:12:47 crc kubenswrapper[4842]: I0311 19:12:47.984372 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-config-data" (OuterVolumeSpecName: "config-data") pod "1135b7fc-e609-4d5c-8301-2606cf886c49" (UID: "1135b7fc-e609-4d5c-8301-2606cf886c49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.056886 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.056938 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlw5r\" (UniqueName: \"kubernetes.io/projected/1135b7fc-e609-4d5c-8301-2606cf886c49-kube-api-access-hlw5r\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.056952 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1135b7fc-e609-4d5c-8301-2606cf886c49-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.517663 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" event={"ID":"1135b7fc-e609-4d5c-8301-2606cf886c49","Type":"ContainerDied","Data":"035506c7c920131e06a18aeacd7a1e8bbb43a874669ace5b384febb778e25ac6"} Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.517901 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035506c7c920131e06a18aeacd7a1e8bbb43a874669ace5b384febb778e25ac6" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.517829 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.622658 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:12:48 crc kubenswrapper[4842]: E0311 19:12:48.623098 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1135b7fc-e609-4d5c-8301-2606cf886c49" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.623117 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1135b7fc-e609-4d5c-8301-2606cf886c49" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.623350 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1135b7fc-e609-4d5c-8301-2606cf886c49" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.624047 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.628051 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-kvrtt" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.628587 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.633309 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.768189 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7bsr\" (UniqueName: \"kubernetes.io/projected/9e19bc52-96e1-47ba-83de-cdad48efca4f-kube-api-access-q7bsr\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.768507 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e19bc52-96e1-47ba-83de-cdad48efca4f-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.871142 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7bsr\" (UniqueName: \"kubernetes.io/projected/9e19bc52-96e1-47ba-83de-cdad48efca4f-kube-api-access-q7bsr\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.871339 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e19bc52-96e1-47ba-83de-cdad48efca4f-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.878941 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e19bc52-96e1-47ba-83de-cdad48efca4f-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.888012 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7bsr\" (UniqueName: \"kubernetes.io/projected/9e19bc52-96e1-47ba-83de-cdad48efca4f-kube-api-access-q7bsr\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:48 crc kubenswrapper[4842]: I0311 19:12:48.942456 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:49 crc kubenswrapper[4842]: I0311 19:12:49.407626 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:12:49 crc kubenswrapper[4842]: I0311 19:12:49.532748 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"9e19bc52-96e1-47ba-83de-cdad48efca4f","Type":"ContainerStarted","Data":"e1463ea59eb2972732082efc7919fb73aca7f7f970788fdf7fc873f1e5cb52ab"} Mar 11 19:12:50 crc kubenswrapper[4842]: I0311 19:12:50.543816 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"9e19bc52-96e1-47ba-83de-cdad48efca4f","Type":"ContainerStarted","Data":"537a24a5bb0b6e011b4c285f435d493276c1151c3a90489dd7a5617c9dc3402f"} Mar 11 19:12:50 crc kubenswrapper[4842]: I0311 19:12:50.543975 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:50 crc kubenswrapper[4842]: I0311 19:12:50.562978 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podStartSLOduration=2.562955676 podStartE2EDuration="2.562955676s" podCreationTimestamp="2026-03-11 19:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:12:50.562435972 +0000 UTC m=+1416.210132252" watchObservedRunningTime="2026-03-11 19:12:50.562955676 +0000 UTC m=+1416.210651956" Mar 11 19:12:58 crc kubenswrapper[4842]: I0311 19:12:58.983986 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.486103 4842 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh"] Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.488865 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.491831 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-scripts" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.496265 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-config-data" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.505987 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh"] Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.612334 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flpbv\" (UniqueName: \"kubernetes.io/projected/452cba63-3d2d-41ae-a655-2f9b6cf932e9-kube-api-access-flpbv\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.612855 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-scripts\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.613046 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-config-data\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: 
\"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.714710 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flpbv\" (UniqueName: \"kubernetes.io/projected/452cba63-3d2d-41ae-a655-2f9b6cf932e9-kube-api-access-flpbv\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.714814 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-scripts\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.714961 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-config-data\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.723234 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-config-data\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.733924 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-scripts\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" 
(UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.751848 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flpbv\" (UniqueName: \"kubernetes.io/projected/452cba63-3d2d-41ae-a655-2f9b6cf932e9-kube-api-access-flpbv\") pod \"nova-kuttl-cell0-cell-mapping-dt5dh\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.788881 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.790095 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.793577 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-novncproxy-config-data" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.805376 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.809843 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.885093 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.886440 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.889308 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.900239 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.920290 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/eff533f9-5b87-4048-a157-23b2f93578db-kube-api-access-tvs6s\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.920368 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff533f9-5b87-4048-a157-23b2f93578db-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.994810 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:12:59 crc kubenswrapper[4842]: I0311 19:12:59.995915 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.000897 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.024095 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32f0d78-a716-4911-b32c-f5b8c9b17561-logs\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.026454 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/eff533f9-5b87-4048-a157-23b2f93578db-kube-api-access-tvs6s\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.026526 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bw79\" (UniqueName: \"kubernetes.io/projected/b32f0d78-a716-4911-b32c-f5b8c9b17561-kube-api-access-8bw79\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.026604 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff533f9-5b87-4048-a157-23b2f93578db-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.026639 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32f0d78-a716-4911-b32c-f5b8c9b17561-config-data\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.044019 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.089754 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.091556 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.094514 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/eff533f9-5b87-4048-a157-23b2f93578db-kube-api-access-tvs6s\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.095725 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.095769 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff533f9-5b87-4048-a157-23b2f93578db-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.103808 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.118777 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.128049 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/013b01a1-a53b-4c9f-b525-7fc883306119-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.128137 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32f0d78-a716-4911-b32c-f5b8c9b17561-logs\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.128164 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srqm8\" (UniqueName: \"kubernetes.io/projected/013b01a1-a53b-4c9f-b525-7fc883306119-kube-api-access-srqm8\") pod \"nova-kuttl-scheduler-0\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.128207 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bw79\" (UniqueName: \"kubernetes.io/projected/b32f0d78-a716-4911-b32c-f5b8c9b17561-kube-api-access-8bw79\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.128237 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32f0d78-a716-4911-b32c-f5b8c9b17561-config-data\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " 
pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.130734 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32f0d78-a716-4911-b32c-f5b8c9b17561-logs\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.133592 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32f0d78-a716-4911-b32c-f5b8c9b17561-config-data\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.170088 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bw79\" (UniqueName: \"kubernetes.io/projected/b32f0d78-a716-4911-b32c-f5b8c9b17561-kube-api-access-8bw79\") pod \"nova-kuttl-api-0\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.206307 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.230341 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14af192a-7cc9-46fc-9517-3005bfe64806-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.230412 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/013b01a1-a53b-4c9f-b525-7fc883306119-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.230455 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14af192a-7cc9-46fc-9517-3005bfe64806-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.230517 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srqm8\" (UniqueName: \"kubernetes.io/projected/013b01a1-a53b-4c9f-b525-7fc883306119-kube-api-access-srqm8\") pod \"nova-kuttl-scheduler-0\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.230582 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dsx8\" (UniqueName: \"kubernetes.io/projected/14af192a-7cc9-46fc-9517-3005bfe64806-kube-api-access-6dsx8\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " 
pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.234017 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/013b01a1-a53b-4c9f-b525-7fc883306119-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.255169 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srqm8\" (UniqueName: \"kubernetes.io/projected/013b01a1-a53b-4c9f-b525-7fc883306119-kube-api-access-srqm8\") pod \"nova-kuttl-scheduler-0\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.332506 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dsx8\" (UniqueName: \"kubernetes.io/projected/14af192a-7cc9-46fc-9517-3005bfe64806-kube-api-access-6dsx8\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.332583 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14af192a-7cc9-46fc-9517-3005bfe64806-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.332606 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14af192a-7cc9-46fc-9517-3005bfe64806-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 
19:13:00.333899 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14af192a-7cc9-46fc-9517-3005bfe64806-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.335845 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14af192a-7cc9-46fc-9517-3005bfe64806-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.336036 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.363576 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh"] Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.395850 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dsx8\" (UniqueName: \"kubernetes.io/projected/14af192a-7cc9-46fc-9517-3005bfe64806-kube-api-access-6dsx8\") pod \"nova-kuttl-metadata-0\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.412227 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.623111 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:00 crc kubenswrapper[4842]: W0311 19:13:00.692066 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff533f9_5b87_4048_a157_23b2f93578db.slice/crio-a9822c4b50cd4464bda61edc7b8234c5f790ead8d09c002bb035bcbd668c6b81 WatchSource:0}: Error finding container a9822c4b50cd4464bda61edc7b8234c5f790ead8d09c002bb035bcbd668c6b81: Status 404 returned error can't find the container with id a9822c4b50cd4464bda61edc7b8234c5f790ead8d09c002bb035bcbd668c6b81 Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.696633 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.708906 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" event={"ID":"452cba63-3d2d-41ae-a655-2f9b6cf932e9","Type":"ContainerStarted","Data":"9f6e5f5b3bf5a44341d9b3ee2d7dd27c14dcec54bb08b823665665ce4f37fc3f"} Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.715063 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b32f0d78-a716-4911-b32c-f5b8c9b17561","Type":"ContainerStarted","Data":"41efacfe830b63a3f9de412c664288c48e61dc5bafefd4d4b5457e39840b6370"} Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.791302 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28"] Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.792431 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.794367 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.797670 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-scripts" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.801449 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28"] Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.945626 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.946058 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:00 crc kubenswrapper[4842]: I0311 19:13:00.946097 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2lx\" (UniqueName: \"kubernetes.io/projected/0f912a68-5aee-4b66-9c2b-a25bc7736725-kube-api-access-6m2lx\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc 
kubenswrapper[4842]: I0311 19:13:01.048321 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.048509 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.048608 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2lx\" (UniqueName: \"kubernetes.io/projected/0f912a68-5aee-4b66-9c2b-a25bc7736725-kube-api-access-6m2lx\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.048946 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.056252 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.059219 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.066523 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2lx\" (UniqueName: \"kubernetes.io/projected/0f912a68-5aee-4b66-9c2b-a25bc7736725-kube-api-access-6m2lx\") pod \"nova-kuttl-cell1-conductor-db-sync-ccv28\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.113849 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.248241 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.618601 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28"] Mar 11 19:13:01 crc kubenswrapper[4842]: W0311 19:13:01.634657 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f912a68_5aee_4b66_9c2b_a25bc7736725.slice/crio-f58bfcc0b01fce6bfc957cc2866ea4ff39775f8f38a2ac333ab6f7517e0eb7d4 WatchSource:0}: Error finding container f58bfcc0b01fce6bfc957cc2866ea4ff39775f8f38a2ac333ab6f7517e0eb7d4: Status 404 returned error can't find the container with id f58bfcc0b01fce6bfc957cc2866ea4ff39775f8f38a2ac333ab6f7517e0eb7d4 Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.726579 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" 
event={"ID":"0f912a68-5aee-4b66-9c2b-a25bc7736725","Type":"ContainerStarted","Data":"f58bfcc0b01fce6bfc957cc2866ea4ff39775f8f38a2ac333ab6f7517e0eb7d4"} Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.732687 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" event={"ID":"452cba63-3d2d-41ae-a655-2f9b6cf932e9","Type":"ContainerStarted","Data":"fbd32d5d5fae101899de3c730b117bf79165112040f42ff40b9a7e49b838f4a8"} Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.739249 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"eff533f9-5b87-4048-a157-23b2f93578db","Type":"ContainerStarted","Data":"a9822c4b50cd4464bda61edc7b8234c5f790ead8d09c002bb035bcbd668c6b81"} Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.742344 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"14af192a-7cc9-46fc-9517-3005bfe64806","Type":"ContainerStarted","Data":"36f25e3562afc1da4069e3314e2f48f136c0ad4c6bec16c9cc5e6559b3950396"} Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.743935 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"013b01a1-a53b-4c9f-b525-7fc883306119","Type":"ContainerStarted","Data":"254e75c191c3f187bbf8e1076f1e90d1209888fff5fdb09db689fab18497370f"} Mar 11 19:13:01 crc kubenswrapper[4842]: I0311 19:13:01.756141 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" podStartSLOduration=2.756118846 podStartE2EDuration="2.756118846s" podCreationTimestamp="2026-03-11 19:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:01.749747684 +0000 UTC m=+1427.397443974" watchObservedRunningTime="2026-03-11 19:13:01.756118846 
+0000 UTC m=+1427.403815136" Mar 11 19:13:02 crc kubenswrapper[4842]: I0311 19:13:02.771788 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" event={"ID":"0f912a68-5aee-4b66-9c2b-a25bc7736725","Type":"ContainerStarted","Data":"f74d98a84106985b1b7e16462634e52b457f1df4056f407abbf4a1a02ea95911"} Mar 11 19:13:02 crc kubenswrapper[4842]: I0311 19:13:02.807023 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" podStartSLOduration=2.807004573 podStartE2EDuration="2.807004573s" podCreationTimestamp="2026-03-11 19:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:02.798570775 +0000 UTC m=+1428.446267065" watchObservedRunningTime="2026-03-11 19:13:02.807004573 +0000 UTC m=+1428.454700843" Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.805158 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"14af192a-7cc9-46fc-9517-3005bfe64806","Type":"ContainerStarted","Data":"72633c074e43702d834d4669d166fadd97e000eb7a3fd4646e4b905fe364237e"} Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.805702 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"14af192a-7cc9-46fc-9517-3005bfe64806","Type":"ContainerStarted","Data":"9a57a25614322f25b91be5684d0cd0bf72b3fc584c65909411ead6e5af755ab3"} Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.808644 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"013b01a1-a53b-4c9f-b525-7fc883306119","Type":"ContainerStarted","Data":"054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20"} Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.811774 4842 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"eff533f9-5b87-4048-a157-23b2f93578db","Type":"ContainerStarted","Data":"d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3"} Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.814351 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b32f0d78-a716-4911-b32c-f5b8c9b17561","Type":"ContainerStarted","Data":"c52fea5fcc683c704f3b76dd832e49d4fa54746978560143c46899473e49cf1f"} Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.814381 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b32f0d78-a716-4911-b32c-f5b8c9b17561","Type":"ContainerStarted","Data":"dec1f6a7db47241e7d41782718dd7609b89a7e247c2ef60ab8512b11da23b8f8"} Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.828311 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=1.855581918 podStartE2EDuration="4.828291048s" podCreationTimestamp="2026-03-11 19:13:00 +0000 UTC" firstStartedPulling="2026-03-11 19:13:01.261441217 +0000 UTC m=+1426.909137507" lastFinishedPulling="2026-03-11 19:13:04.234150337 +0000 UTC m=+1429.881846637" observedRunningTime="2026-03-11 19:13:04.825640647 +0000 UTC m=+1430.473336927" watchObservedRunningTime="2026-03-11 19:13:04.828291048 +0000 UTC m=+1430.475987338" Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.878068 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.33575378 podStartE2EDuration="5.87803767s" podCreationTimestamp="2026-03-11 19:12:59 +0000 UTC" firstStartedPulling="2026-03-11 19:13:00.686697608 +0000 UTC m=+1426.334393888" lastFinishedPulling="2026-03-11 19:13:04.228981488 +0000 UTC m=+1429.876677778" observedRunningTime="2026-03-11 19:13:04.849804389 
+0000 UTC m=+1430.497500699" watchObservedRunningTime="2026-03-11 19:13:04.87803767 +0000 UTC m=+1430.525733960" Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.882004 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.711357109 podStartE2EDuration="5.881993967s" podCreationTimestamp="2026-03-11 19:12:59 +0000 UTC" firstStartedPulling="2026-03-11 19:13:01.057538628 +0000 UTC m=+1426.705234908" lastFinishedPulling="2026-03-11 19:13:04.228175476 +0000 UTC m=+1429.875871766" observedRunningTime="2026-03-11 19:13:04.879581212 +0000 UTC m=+1430.527277492" watchObservedRunningTime="2026-03-11 19:13:04.881993967 +0000 UTC m=+1430.529690257" Mar 11 19:13:04 crc kubenswrapper[4842]: I0311 19:13:04.933146 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podStartSLOduration=2.403756834 podStartE2EDuration="5.933118235s" podCreationTimestamp="2026-03-11 19:12:59 +0000 UTC" firstStartedPulling="2026-03-11 19:13:00.696933374 +0000 UTC m=+1426.344629654" lastFinishedPulling="2026-03-11 19:13:04.226294775 +0000 UTC m=+1429.873991055" observedRunningTime="2026-03-11 19:13:04.896152368 +0000 UTC m=+1430.543848648" watchObservedRunningTime="2026-03-11 19:13:04.933118235 +0000 UTC m=+1430.580814515" Mar 11 19:13:05 crc kubenswrapper[4842]: I0311 19:13:05.119588 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:13:05 crc kubenswrapper[4842]: I0311 19:13:05.337061 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:09 crc kubenswrapper[4842]: I0311 19:13:09.861824 4842 generic.go:334] "Generic (PLEG): container finished" podID="452cba63-3d2d-41ae-a655-2f9b6cf932e9" containerID="fbd32d5d5fae101899de3c730b117bf79165112040f42ff40b9a7e49b838f4a8" 
exitCode=0
Mar 11 19:13:09 crc kubenswrapper[4842]: I0311 19:13:09.861954 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" event={"ID":"452cba63-3d2d-41ae-a655-2f9b6cf932e9","Type":"ContainerDied","Data":"fbd32d5d5fae101899de3c730b117bf79165112040f42ff40b9a7e49b838f4a8"}
Mar 11 19:13:09 crc kubenswrapper[4842]: I0311 19:13:09.865573 4842 generic.go:334] "Generic (PLEG): container finished" podID="0f912a68-5aee-4b66-9c2b-a25bc7736725" containerID="f74d98a84106985b1b7e16462634e52b457f1df4056f407abbf4a1a02ea95911" exitCode=0
Mar 11 19:13:09 crc kubenswrapper[4842]: I0311 19:13:09.865633 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" event={"ID":"0f912a68-5aee-4b66-9c2b-a25bc7736725","Type":"ContainerDied","Data":"f74d98a84106985b1b7e16462634e52b457f1df4056f407abbf4a1a02ea95911"}
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.119855 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.132582 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.207494 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.207557 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.336756 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.370837 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.412819 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.412891 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.881485 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:13:10 crc kubenswrapper[4842]: I0311 19:13:10.917451 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.291406 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.134:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.292118 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.134:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.360754 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.365400 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.455996 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-config-data\") pod \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") "
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.456119 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-scripts\") pod \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") "
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.456161 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m2lx\" (UniqueName: \"kubernetes.io/projected/0f912a68-5aee-4b66-9c2b-a25bc7736725-kube-api-access-6m2lx\") pod \"0f912a68-5aee-4b66-9c2b-a25bc7736725\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") "
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.456206 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-config-data\") pod \"0f912a68-5aee-4b66-9c2b-a25bc7736725\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") "
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.456333 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flpbv\" (UniqueName: \"kubernetes.io/projected/452cba63-3d2d-41ae-a655-2f9b6cf932e9-kube-api-access-flpbv\") pod \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\" (UID: \"452cba63-3d2d-41ae-a655-2f9b6cf932e9\") "
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.456365 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-scripts\") pod \"0f912a68-5aee-4b66-9c2b-a25bc7736725\" (UID: \"0f912a68-5aee-4b66-9c2b-a25bc7736725\") "
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.462499 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452cba63-3d2d-41ae-a655-2f9b6cf932e9-kube-api-access-flpbv" (OuterVolumeSpecName: "kube-api-access-flpbv") pod "452cba63-3d2d-41ae-a655-2f9b6cf932e9" (UID: "452cba63-3d2d-41ae-a655-2f9b6cf932e9"). InnerVolumeSpecName "kube-api-access-flpbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.464710 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-scripts" (OuterVolumeSpecName: "scripts") pod "0f912a68-5aee-4b66-9c2b-a25bc7736725" (UID: "0f912a68-5aee-4b66-9c2b-a25bc7736725"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.467546 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f912a68-5aee-4b66-9c2b-a25bc7736725-kube-api-access-6m2lx" (OuterVolumeSpecName: "kube-api-access-6m2lx") pod "0f912a68-5aee-4b66-9c2b-a25bc7736725" (UID: "0f912a68-5aee-4b66-9c2b-a25bc7736725"). InnerVolumeSpecName "kube-api-access-6m2lx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.480780 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-scripts" (OuterVolumeSpecName: "scripts") pod "452cba63-3d2d-41ae-a655-2f9b6cf932e9" (UID: "452cba63-3d2d-41ae-a655-2f9b6cf932e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.497485 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.136:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.497943 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.136:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.505578 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-config-data" (OuterVolumeSpecName: "config-data") pod "452cba63-3d2d-41ae-a655-2f9b6cf932e9" (UID: "452cba63-3d2d-41ae-a655-2f9b6cf932e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.507362 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-config-data" (OuterVolumeSpecName: "config-data") pod "0f912a68-5aee-4b66-9c2b-a25bc7736725" (UID: "0f912a68-5aee-4b66-9c2b-a25bc7736725"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.558331 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.558369 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flpbv\" (UniqueName: \"kubernetes.io/projected/452cba63-3d2d-41ae-a655-2f9b6cf932e9-kube-api-access-flpbv\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.558379 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f912a68-5aee-4b66-9c2b-a25bc7736725-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.558388 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.558398 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/452cba63-3d2d-41ae-a655-2f9b6cf932e9-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.558408 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m2lx\" (UniqueName: \"kubernetes.io/projected/0f912a68-5aee-4b66-9c2b-a25bc7736725-kube-api-access-6m2lx\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.888908 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.889792 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28" event={"ID":"0f912a68-5aee-4b66-9c2b-a25bc7736725","Type":"ContainerDied","Data":"f58bfcc0b01fce6bfc957cc2866ea4ff39775f8f38a2ac333ab6f7517e0eb7d4"}
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.889866 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f58bfcc0b01fce6bfc957cc2866ea4ff39775f8f38a2ac333ab6f7517e0eb7d4"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.894212 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh"
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.897569 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh" event={"ID":"452cba63-3d2d-41ae-a655-2f9b6cf932e9","Type":"ContainerDied","Data":"9f6e5f5b3bf5a44341d9b3ee2d7dd27c14dcec54bb08b823665665ce4f37fc3f"}
Mar 11 19:13:11 crc kubenswrapper[4842]: I0311 19:13:11.897651 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6e5f5b3bf5a44341d9b3ee2d7dd27c14dcec54bb08b823665665ce4f37fc3f"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.010619 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:13:12 crc kubenswrapper[4842]: E0311 19:13:12.010962 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452cba63-3d2d-41ae-a655-2f9b6cf932e9" containerName="nova-manage"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.010977 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="452cba63-3d2d-41ae-a655-2f9b6cf932e9" containerName="nova-manage"
Mar 11 19:13:12 crc kubenswrapper[4842]: E0311 19:13:12.011002 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f912a68-5aee-4b66-9c2b-a25bc7736725" containerName="nova-kuttl-cell1-conductor-db-sync"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.011011 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f912a68-5aee-4b66-9c2b-a25bc7736725" containerName="nova-kuttl-cell1-conductor-db-sync"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.011158 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f912a68-5aee-4b66-9c2b-a25bc7736725" containerName="nova-kuttl-cell1-conductor-db-sync"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.011176 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="452cba63-3d2d-41ae-a655-2f9b6cf932e9" containerName="nova-manage"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.011727 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.018295 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.029627 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.169885 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9cpz\" (UniqueName: \"kubernetes.io/projected/72beb476-94c1-4b97-bb3a-544abc447548-kube-api-access-x9cpz\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.170085 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72beb476-94c1-4b97-bb3a-544abc447548-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.207745 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.208020 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-log" containerID="cri-o://dec1f6a7db47241e7d41782718dd7609b89a7e247c2ef60ab8512b11da23b8f8" gracePeriod=30
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.208138 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-api" containerID="cri-o://c52fea5fcc683c704f3b76dd832e49d4fa54746978560143c46899473e49cf1f" gracePeriod=30
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.223005 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.272479 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9cpz\" (UniqueName: \"kubernetes.io/projected/72beb476-94c1-4b97-bb3a-544abc447548-kube-api-access-x9cpz\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.272588 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72beb476-94c1-4b97-bb3a-544abc447548-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.282687 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72beb476-94c1-4b97-bb3a-544abc447548-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.297618 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9cpz\" (UniqueName: \"kubernetes.io/projected/72beb476-94c1-4b97-bb3a-544abc447548-kube-api-access-x9cpz\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.329952 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.346469 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.346699 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-log" containerID="cri-o://9a57a25614322f25b91be5684d0cd0bf72b3fc584c65909411ead6e5af755ab3" gracePeriod=30
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.347125 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://72633c074e43702d834d4669d166fadd97e000eb7a3fd4646e4b905fe364237e" gracePeriod=30
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.816810 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:13:12 crc kubenswrapper[4842]: W0311 19:13:12.817437 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72beb476_94c1_4b97_bb3a_544abc447548.slice/crio-b6ab19762c847a6837d092cefb3e159edc6a74d365ce2d55db4ac764160fb7b7 WatchSource:0}: Error finding container b6ab19762c847a6837d092cefb3e159edc6a74d365ce2d55db4ac764160fb7b7: Status 404 returned error can't find the container with id b6ab19762c847a6837d092cefb3e159edc6a74d365ce2d55db4ac764160fb7b7
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.917786 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"72beb476-94c1-4b97-bb3a-544abc447548","Type":"ContainerStarted","Data":"b6ab19762c847a6837d092cefb3e159edc6a74d365ce2d55db4ac764160fb7b7"}
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.926477 4842 generic.go:334] "Generic (PLEG): container finished" podID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerID="dec1f6a7db47241e7d41782718dd7609b89a7e247c2ef60ab8512b11da23b8f8" exitCode=143
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.926570 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b32f0d78-a716-4911-b32c-f5b8c9b17561","Type":"ContainerDied","Data":"dec1f6a7db47241e7d41782718dd7609b89a7e247c2ef60ab8512b11da23b8f8"}
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.928997 4842 generic.go:334] "Generic (PLEG): container finished" podID="14af192a-7cc9-46fc-9517-3005bfe64806" containerID="9a57a25614322f25b91be5684d0cd0bf72b3fc584c65909411ead6e5af755ab3" exitCode=143
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.929126 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"14af192a-7cc9-46fc-9517-3005bfe64806","Type":"ContainerDied","Data":"9a57a25614322f25b91be5684d0cd0bf72b3fc584c65909411ead6e5af755ab3"}
Mar 11 19:13:12 crc kubenswrapper[4842]: I0311 19:13:12.929176 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="013b01a1-a53b-4c9f-b525-7fc883306119" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20" gracePeriod=30
Mar 11 19:13:13 crc kubenswrapper[4842]: I0311 19:13:13.942801 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"72beb476-94c1-4b97-bb3a-544abc447548","Type":"ContainerStarted","Data":"ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3"}
Mar 11 19:13:13 crc kubenswrapper[4842]: I0311 19:13:13.944123 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:13:13 crc kubenswrapper[4842]: I0311 19:13:13.967292 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=2.967246145 podStartE2EDuration="2.967246145s" podCreationTimestamp="2026-03-11 19:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:13.966668079 +0000 UTC m=+1439.614364399" watchObservedRunningTime="2026-03-11 19:13:13.967246145 +0000 UTC m=+1439.614942435"
Mar 11 19:13:15 crc kubenswrapper[4842]: E0311 19:13:15.340555 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 19:13:15 crc kubenswrapper[4842]: E0311 19:13:15.343105 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 19:13:15 crc kubenswrapper[4842]: E0311 19:13:15.345746 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 19:13:15 crc kubenswrapper[4842]: E0311 19:13:15.345821 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="013b01a1-a53b-4c9f-b525-7fc883306119" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:13:15 crc kubenswrapper[4842]: I0311 19:13:15.998338 4842 generic.go:334] "Generic (PLEG): container finished" podID="013b01a1-a53b-4c9f-b525-7fc883306119" containerID="054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20" exitCode=0
Mar 11 19:13:15 crc kubenswrapper[4842]: I0311 19:13:15.998721 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"013b01a1-a53b-4c9f-b525-7fc883306119","Type":"ContainerDied","Data":"054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20"}
Mar 11 19:13:16 crc kubenswrapper[4842]: I0311 19:13:16.328798 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:16 crc kubenswrapper[4842]: I0311 19:13:16.452018 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srqm8\" (UniqueName: \"kubernetes.io/projected/013b01a1-a53b-4c9f-b525-7fc883306119-kube-api-access-srqm8\") pod \"013b01a1-a53b-4c9f-b525-7fc883306119\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") "
Mar 11 19:13:16 crc kubenswrapper[4842]: I0311 19:13:16.452147 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/013b01a1-a53b-4c9f-b525-7fc883306119-config-data\") pod \"013b01a1-a53b-4c9f-b525-7fc883306119\" (UID: \"013b01a1-a53b-4c9f-b525-7fc883306119\") "
Mar 11 19:13:16 crc kubenswrapper[4842]: I0311 19:13:16.473539 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013b01a1-a53b-4c9f-b525-7fc883306119-kube-api-access-srqm8" (OuterVolumeSpecName: "kube-api-access-srqm8") pod "013b01a1-a53b-4c9f-b525-7fc883306119" (UID: "013b01a1-a53b-4c9f-b525-7fc883306119"). InnerVolumeSpecName "kube-api-access-srqm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:13:16 crc kubenswrapper[4842]: I0311 19:13:16.476687 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013b01a1-a53b-4c9f-b525-7fc883306119-config-data" (OuterVolumeSpecName: "config-data") pod "013b01a1-a53b-4c9f-b525-7fc883306119" (UID: "013b01a1-a53b-4c9f-b525-7fc883306119"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:13:16 crc kubenswrapper[4842]: I0311 19:13:16.554333 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srqm8\" (UniqueName: \"kubernetes.io/projected/013b01a1-a53b-4c9f-b525-7fc883306119-kube-api-access-srqm8\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:16 crc kubenswrapper[4842]: I0311 19:13:16.554378 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/013b01a1-a53b-4c9f-b525-7fc883306119-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.013673 4842 generic.go:334] "Generic (PLEG): container finished" podID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerID="c52fea5fcc683c704f3b76dd832e49d4fa54746978560143c46899473e49cf1f" exitCode=0
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.013749 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b32f0d78-a716-4911-b32c-f5b8c9b17561","Type":"ContainerDied","Data":"c52fea5fcc683c704f3b76dd832e49d4fa54746978560143c46899473e49cf1f"}
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.014070 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b32f0d78-a716-4911-b32c-f5b8c9b17561","Type":"ContainerDied","Data":"41efacfe830b63a3f9de412c664288c48e61dc5bafefd4d4b5457e39840b6370"}
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.014087 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41efacfe830b63a3f9de412c664288c48e61dc5bafefd4d4b5457e39840b6370"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.017146 4842 generic.go:334] "Generic (PLEG): container finished" podID="14af192a-7cc9-46fc-9517-3005bfe64806" containerID="72633c074e43702d834d4669d166fadd97e000eb7a3fd4646e4b905fe364237e" exitCode=0
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.017208 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"14af192a-7cc9-46fc-9517-3005bfe64806","Type":"ContainerDied","Data":"72633c074e43702d834d4669d166fadd97e000eb7a3fd4646e4b905fe364237e"}
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.021371 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"013b01a1-a53b-4c9f-b525-7fc883306119","Type":"ContainerDied","Data":"254e75c191c3f187bbf8e1076f1e90d1209888fff5fdb09db689fab18497370f"}
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.021406 4842 scope.go:117] "RemoveContainer" containerID="054133405d2ca2568a5818546ae52cf8a05105b164a15a5b8ee14dcb000a5d20"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.021548 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.045003 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.070839 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.099416 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.129550 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:13:17 crc kubenswrapper[4842]: E0311 19:13:17.130490 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013b01a1-a53b-4c9f-b525-7fc883306119" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.130536 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="013b01a1-a53b-4c9f-b525-7fc883306119" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:13:17 crc kubenswrapper[4842]: E0311 19:13:17.130621 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-log"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.130629 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-log"
Mar 11 19:13:17 crc kubenswrapper[4842]: E0311 19:13:17.130654 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-api"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.130664 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-api"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.131155 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-api"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.131174 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="013b01a1-a53b-4c9f-b525-7fc883306119" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.131191 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" containerName="nova-kuttl-api-log"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.132339 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.141408 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.163838 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32f0d78-a716-4911-b32c-f5b8c9b17561-logs\") pod \"b32f0d78-a716-4911-b32c-f5b8c9b17561\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") "
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.163881 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bw79\" (UniqueName: \"kubernetes.io/projected/b32f0d78-a716-4911-b32c-f5b8c9b17561-kube-api-access-8bw79\") pod \"b32f0d78-a716-4911-b32c-f5b8c9b17561\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") "
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.163904 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32f0d78-a716-4911-b32c-f5b8c9b17561-config-data\") pod \"b32f0d78-a716-4911-b32c-f5b8c9b17561\" (UID: \"b32f0d78-a716-4911-b32c-f5b8c9b17561\") "
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.164614 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b32f0d78-a716-4911-b32c-f5b8c9b17561-logs" (OuterVolumeSpecName: "logs") pod "b32f0d78-a716-4911-b32c-f5b8c9b17561" (UID: "b32f0d78-a716-4911-b32c-f5b8c9b17561"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.168573 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32f0d78-a716-4911-b32c-f5b8c9b17561-kube-api-access-8bw79" (OuterVolumeSpecName: "kube-api-access-8bw79") pod "b32f0d78-a716-4911-b32c-f5b8c9b17561" (UID: "b32f0d78-a716-4911-b32c-f5b8c9b17561"). InnerVolumeSpecName "kube-api-access-8bw79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.173540 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.184337 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32f0d78-a716-4911-b32c-f5b8c9b17561-config-data" (OuterVolumeSpecName: "config-data") pod "b32f0d78-a716-4911-b32c-f5b8c9b17561" (UID: "b32f0d78-a716-4911-b32c-f5b8c9b17561"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.232726 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.265501 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c772b4-865d-48e4-bddd-28346fd4ae3b-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.265597 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmzfl\" (UniqueName: \"kubernetes.io/projected/57c772b4-865d-48e4-bddd-28346fd4ae3b-kube-api-access-wmzfl\") pod \"nova-kuttl-scheduler-0\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.265687 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b32f0d78-a716-4911-b32c-f5b8c9b17561-logs\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.265703 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bw79\" (UniqueName: \"kubernetes.io/projected/b32f0d78-a716-4911-b32c-f5b8c9b17561-kube-api-access-8bw79\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.265712 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32f0d78-a716-4911-b32c-f5b8c9b17561-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.367093 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14af192a-7cc9-46fc-9517-3005bfe64806-config-data\") pod \"14af192a-7cc9-46fc-9517-3005bfe64806\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") "
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.367166 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14af192a-7cc9-46fc-9517-3005bfe64806-logs\") pod \"14af192a-7cc9-46fc-9517-3005bfe64806\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") "
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.367230 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dsx8\" (UniqueName: \"kubernetes.io/projected/14af192a-7cc9-46fc-9517-3005bfe64806-kube-api-access-6dsx8\") pod \"14af192a-7cc9-46fc-9517-3005bfe64806\" (UID: \"14af192a-7cc9-46fc-9517-3005bfe64806\") "
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.367534 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c772b4-865d-48e4-bddd-28346fd4ae3b-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.367640 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmzfl\" (UniqueName: \"kubernetes.io/projected/57c772b4-865d-48e4-bddd-28346fd4ae3b-kube-api-access-wmzfl\") pod \"nova-kuttl-scheduler-0\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.367953 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14af192a-7cc9-46fc-9517-3005bfe64806-logs" (OuterVolumeSpecName: "logs") pod "14af192a-7cc9-46fc-9517-3005bfe64806" (UID: "14af192a-7cc9-46fc-9517-3005bfe64806"). InnerVolumeSpecName "logs".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.368415 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14af192a-7cc9-46fc-9517-3005bfe64806-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.370938 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14af192a-7cc9-46fc-9517-3005bfe64806-kube-api-access-6dsx8" (OuterVolumeSpecName: "kube-api-access-6dsx8") pod "14af192a-7cc9-46fc-9517-3005bfe64806" (UID: "14af192a-7cc9-46fc-9517-3005bfe64806"). InnerVolumeSpecName "kube-api-access-6dsx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.372060 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c772b4-865d-48e4-bddd-28346fd4ae3b-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.394860 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmzfl\" (UniqueName: \"kubernetes.io/projected/57c772b4-865d-48e4-bddd-28346fd4ae3b-kube-api-access-wmzfl\") pod \"nova-kuttl-scheduler-0\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.400957 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14af192a-7cc9-46fc-9517-3005bfe64806-config-data" (OuterVolumeSpecName: "config-data") pod "14af192a-7cc9-46fc-9517-3005bfe64806" (UID: "14af192a-7cc9-46fc-9517-3005bfe64806"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.462763 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.469827 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14af192a-7cc9-46fc-9517-3005bfe64806-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:17 crc kubenswrapper[4842]: I0311 19:13:17.469857 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dsx8\" (UniqueName: \"kubernetes.io/projected/14af192a-7cc9-46fc-9517-3005bfe64806-kube-api-access-6dsx8\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.031651 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"14af192a-7cc9-46fc-9517-3005bfe64806","Type":"ContainerDied","Data":"36f25e3562afc1da4069e3314e2f48f136c0ad4c6bec16c9cc5e6559b3950396"} Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.031687 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.032048 4842 scope.go:117] "RemoveContainer" containerID="72633c074e43702d834d4669d166fadd97e000eb7a3fd4646e4b905fe364237e" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.032561 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.053044 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.069014 4842 scope.go:117] "RemoveContainer" containerID="9a57a25614322f25b91be5684d0cd0bf72b3fc584c65909411ead6e5af755ab3" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.205290 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.215253 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.222624 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.231760 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.240055 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: E0311 19:13:18.241066 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-metadata" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.241212 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-metadata" Mar 11 19:13:18 crc kubenswrapper[4842]: E0311 19:13:18.241380 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-log" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.241533 4842 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-log" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.242001 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-log" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.242134 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" containerName="nova-kuttl-metadata-metadata" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.244213 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.246618 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.248706 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.249260 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.253228 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.286874 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da239a7e-377d-4e6c-b6f8-a159ad929950-logs\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.286922 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ntz\" (UniqueName: 
\"kubernetes.io/projected/da239a7e-377d-4e6c-b6f8-a159ad929950-kube-api-access-g2ntz\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.287042 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da239a7e-377d-4e6c-b6f8-a159ad929950-config-data\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.287097 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j29t\" (UniqueName: \"kubernetes.io/projected/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-kube-api-access-4j29t\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.287136 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.287193 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.292330 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 
19:13:18.306872 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.388400 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.388450 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da239a7e-377d-4e6c-b6f8-a159ad929950-logs\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.388482 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2ntz\" (UniqueName: \"kubernetes.io/projected/da239a7e-377d-4e6c-b6f8-a159ad929950-kube-api-access-g2ntz\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.388537 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da239a7e-377d-4e6c-b6f8-a159ad929950-config-data\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.388868 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j29t\" (UniqueName: \"kubernetes.io/projected/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-kube-api-access-4j29t\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 
19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.388982 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.388907 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.389787 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da239a7e-377d-4e6c-b6f8-a159ad929950-logs\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.393869 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da239a7e-377d-4e6c-b6f8-a159ad929950-config-data\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.400963 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.410062 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ntz\" (UniqueName: 
\"kubernetes.io/projected/da239a7e-377d-4e6c-b6f8-a159ad929950-kube-api-access-g2ntz\") pod \"nova-kuttl-api-0\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.412785 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j29t\" (UniqueName: \"kubernetes.io/projected/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-kube-api-access-4j29t\") pod \"nova-kuttl-metadata-0\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.575037 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.586546 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:18 crc kubenswrapper[4842]: W0311 19:13:18.945066 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda239a7e_377d_4e6c_b6f8_a159ad929950.slice/crio-54b634c5a87f836a3f2893362b2cf7acae003ea48f86ec6bf5a17ec514e75134 WatchSource:0}: Error finding container 54b634c5a87f836a3f2893362b2cf7acae003ea48f86ec6bf5a17ec514e75134: Status 404 returned error can't find the container with id 54b634c5a87f836a3f2893362b2cf7acae003ea48f86ec6bf5a17ec514e75134 Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.949223 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.977419 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013b01a1-a53b-4c9f-b525-7fc883306119" path="/var/lib/kubelet/pods/013b01a1-a53b-4c9f-b525-7fc883306119/volumes" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.978234 4842 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14af192a-7cc9-46fc-9517-3005bfe64806" path="/var/lib/kubelet/pods/14af192a-7cc9-46fc-9517-3005bfe64806/volumes" Mar 11 19:13:18 crc kubenswrapper[4842]: I0311 19:13:18.978838 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32f0d78-a716-4911-b32c-f5b8c9b17561" path="/var/lib/kubelet/pods/b32f0d78-a716-4911-b32c-f5b8c9b17561/volumes" Mar 11 19:13:19 crc kubenswrapper[4842]: I0311 19:13:19.049589 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"57c772b4-865d-48e4-bddd-28346fd4ae3b","Type":"ContainerStarted","Data":"2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3"} Mar 11 19:13:19 crc kubenswrapper[4842]: I0311 19:13:19.049631 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"57c772b4-865d-48e4-bddd-28346fd4ae3b","Type":"ContainerStarted","Data":"fbc3e4fdb4c9085d3fe2b6dcc5353ae202b13d26f44531abbc290f5b4147c2b5"} Mar 11 19:13:19 crc kubenswrapper[4842]: I0311 19:13:19.053541 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"da239a7e-377d-4e6c-b6f8-a159ad929950","Type":"ContainerStarted","Data":"54b634c5a87f836a3f2893362b2cf7acae003ea48f86ec6bf5a17ec514e75134"} Mar 11 19:13:19 crc kubenswrapper[4842]: I0311 19:13:19.069199 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.069176031 podStartE2EDuration="2.069176031s" podCreationTimestamp="2026-03-11 19:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:19.068845742 +0000 UTC m=+1444.716542042" watchObservedRunningTime="2026-03-11 19:13:19.069176031 +0000 UTC m=+1444.716872321" Mar 11 19:13:19 crc kubenswrapper[4842]: I0311 
19:13:19.100823 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:19 crc kubenswrapper[4842]: W0311 19:13:19.106973 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59ab8561_32d5_4ec9_a2bb_d859ef4b03cf.slice/crio-7aa5e72eb43ae9b51ef3cf67f60202fc59348557b3ee3c6e8d7d03377bacd7fe WatchSource:0}: Error finding container 7aa5e72eb43ae9b51ef3cf67f60202fc59348557b3ee3c6e8d7d03377bacd7fe: Status 404 returned error can't find the container with id 7aa5e72eb43ae9b51ef3cf67f60202fc59348557b3ee3c6e8d7d03377bacd7fe Mar 11 19:13:20 crc kubenswrapper[4842]: I0311 19:13:20.069743 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"da239a7e-377d-4e6c-b6f8-a159ad929950","Type":"ContainerStarted","Data":"7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1"} Mar 11 19:13:20 crc kubenswrapper[4842]: I0311 19:13:20.071171 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"da239a7e-377d-4e6c-b6f8-a159ad929950","Type":"ContainerStarted","Data":"df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718"} Mar 11 19:13:20 crc kubenswrapper[4842]: I0311 19:13:20.072577 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf","Type":"ContainerStarted","Data":"4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a"} Mar 11 19:13:20 crc kubenswrapper[4842]: I0311 19:13:20.072614 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf","Type":"ContainerStarted","Data":"a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3"} Mar 11 19:13:20 crc kubenswrapper[4842]: I0311 19:13:20.072627 4842 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf","Type":"ContainerStarted","Data":"7aa5e72eb43ae9b51ef3cf67f60202fc59348557b3ee3c6e8d7d03377bacd7fe"} Mar 11 19:13:20 crc kubenswrapper[4842]: I0311 19:13:20.120450 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.120414219 podStartE2EDuration="2.120414219s" podCreationTimestamp="2026-03-11 19:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:20.093430091 +0000 UTC m=+1445.741126371" watchObservedRunningTime="2026-03-11 19:13:20.120414219 +0000 UTC m=+1445.768110499" Mar 11 19:13:20 crc kubenswrapper[4842]: I0311 19:13:20.121281 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.121259292 podStartE2EDuration="2.121259292s" podCreationTimestamp="2026-03-11 19:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:20.117525851 +0000 UTC m=+1445.765222131" watchObservedRunningTime="2026-03-11 19:13:20.121259292 +0000 UTC m=+1445.768955572" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.364607 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.463687 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.895743 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg"] Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.897683 
4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.901008 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-scripts" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.901077 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-config-data" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.906097 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg"] Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.990498 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-config-data\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.990977 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvbcj\" (UniqueName: \"kubernetes.io/projected/042cdb69-b291-4c2d-a73b-e5ed47ee3760-kube-api-access-kvbcj\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:22 crc kubenswrapper[4842]: I0311 19:13:22.991091 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-scripts\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:23 crc 
kubenswrapper[4842]: I0311 19:13:23.092976 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvbcj\" (UniqueName: \"kubernetes.io/projected/042cdb69-b291-4c2d-a73b-e5ed47ee3760-kube-api-access-kvbcj\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:23 crc kubenswrapper[4842]: I0311 19:13:23.093065 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-scripts\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:23 crc kubenswrapper[4842]: I0311 19:13:23.093104 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-config-data\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:23 crc kubenswrapper[4842]: I0311 19:13:23.107850 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-config-data\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:23 crc kubenswrapper[4842]: I0311 19:13:23.111355 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-scripts\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 
19:13:23 crc kubenswrapper[4842]: I0311 19:13:23.118741 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbcj\" (UniqueName: \"kubernetes.io/projected/042cdb69-b291-4c2d-a73b-e5ed47ee3760-kube-api-access-kvbcj\") pod \"nova-kuttl-cell1-cell-mapping-v7mgg\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:23 crc kubenswrapper[4842]: I0311 19:13:23.222228 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:23 crc kubenswrapper[4842]: I0311 19:13:23.742194 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg"] Mar 11 19:13:24 crc kubenswrapper[4842]: I0311 19:13:24.142874 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" event={"ID":"042cdb69-b291-4c2d-a73b-e5ed47ee3760","Type":"ContainerStarted","Data":"92ce0de87c594880408b7f38d1af77724abee78631a9a283315bf5635ba42013"} Mar 11 19:13:24 crc kubenswrapper[4842]: I0311 19:13:24.143961 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" event={"ID":"042cdb69-b291-4c2d-a73b-e5ed47ee3760","Type":"ContainerStarted","Data":"563a59bf68ca8723bb5e5494ca668953bfc84c2d1540b3410eb2d540a8590ddd"} Mar 11 19:13:24 crc kubenswrapper[4842]: I0311 19:13:24.169527 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" podStartSLOduration=2.169489793 podStartE2EDuration="2.169489793s" podCreationTimestamp="2026-03-11 19:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:24.162085274 +0000 UTC m=+1449.809781574" watchObservedRunningTime="2026-03-11 
19:13:24.169489793 +0000 UTC m=+1449.817186093" Mar 11 19:13:27 crc kubenswrapper[4842]: I0311 19:13:27.463543 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:27 crc kubenswrapper[4842]: I0311 19:13:27.490603 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:28 crc kubenswrapper[4842]: I0311 19:13:28.220343 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:28 crc kubenswrapper[4842]: I0311 19:13:28.577321 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:28 crc kubenswrapper[4842]: I0311 19:13:28.577586 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:28 crc kubenswrapper[4842]: I0311 19:13:28.586991 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:28 crc kubenswrapper[4842]: I0311 19:13:28.587863 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:29 crc kubenswrapper[4842]: I0311 19:13:29.202929 4842 generic.go:334] "Generic (PLEG): container finished" podID="042cdb69-b291-4c2d-a73b-e5ed47ee3760" containerID="92ce0de87c594880408b7f38d1af77724abee78631a9a283315bf5635ba42013" exitCode=0 Mar 11 19:13:29 crc kubenswrapper[4842]: I0311 19:13:29.203040 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" event={"ID":"042cdb69-b291-4c2d-a73b-e5ed47ee3760","Type":"ContainerDied","Data":"92ce0de87c594880408b7f38d1af77724abee78631a9a283315bf5635ba42013"} Mar 11 19:13:29 crc kubenswrapper[4842]: I0311 19:13:29.741545 4842 
prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.141:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:29 crc kubenswrapper[4842]: I0311 19:13:29.741543 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.140:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:29 crc kubenswrapper[4842]: I0311 19:13:29.741729 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.141:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:29 crc kubenswrapper[4842]: I0311 19:13:29.741743 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.140:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.544519 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.733850 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-config-data\") pod \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.733994 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvbcj\" (UniqueName: \"kubernetes.io/projected/042cdb69-b291-4c2d-a73b-e5ed47ee3760-kube-api-access-kvbcj\") pod \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.734072 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-scripts\") pod \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\" (UID: \"042cdb69-b291-4c2d-a73b-e5ed47ee3760\") " Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.752933 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-scripts" (OuterVolumeSpecName: "scripts") pod "042cdb69-b291-4c2d-a73b-e5ed47ee3760" (UID: "042cdb69-b291-4c2d-a73b-e5ed47ee3760"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.752986 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042cdb69-b291-4c2d-a73b-e5ed47ee3760-kube-api-access-kvbcj" (OuterVolumeSpecName: "kube-api-access-kvbcj") pod "042cdb69-b291-4c2d-a73b-e5ed47ee3760" (UID: "042cdb69-b291-4c2d-a73b-e5ed47ee3760"). InnerVolumeSpecName "kube-api-access-kvbcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.761650 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-config-data" (OuterVolumeSpecName: "config-data") pod "042cdb69-b291-4c2d-a73b-e5ed47ee3760" (UID: "042cdb69-b291-4c2d-a73b-e5ed47ee3760"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.836445 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.836717 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvbcj\" (UniqueName: \"kubernetes.io/projected/042cdb69-b291-4c2d-a73b-e5ed47ee3760-kube-api-access-kvbcj\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:30 crc kubenswrapper[4842]: I0311 19:13:30.836728 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/042cdb69-b291-4c2d-a73b-e5ed47ee3760-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.218711 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" event={"ID":"042cdb69-b291-4c2d-a73b-e5ed47ee3760","Type":"ContainerDied","Data":"563a59bf68ca8723bb5e5494ca668953bfc84c2d1540b3410eb2d540a8590ddd"} Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.218750 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563a59bf68ca8723bb5e5494ca668953bfc84c2d1540b3410eb2d540a8590ddd" Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.218804 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg" Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.422051 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.422405 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-log" containerID="cri-o://df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718" gracePeriod=30 Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.422462 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-api" containerID="cri-o://7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1" gracePeriod=30 Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.441769 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.441974 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="57c772b4-865d-48e4-bddd-28346fd4ae3b" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" gracePeriod=30 Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.543706 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.543929 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-log" 
containerID="cri-o://a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3" gracePeriod=30 Mar 11 19:13:31 crc kubenswrapper[4842]: I0311 19:13:31.544021 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a" gracePeriod=30 Mar 11 19:13:32 crc kubenswrapper[4842]: I0311 19:13:32.227370 4842 generic.go:334] "Generic (PLEG): container finished" podID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerID="df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718" exitCode=143 Mar 11 19:13:32 crc kubenswrapper[4842]: I0311 19:13:32.227435 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"da239a7e-377d-4e6c-b6f8-a159ad929950","Type":"ContainerDied","Data":"df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718"} Mar 11 19:13:32 crc kubenswrapper[4842]: I0311 19:13:32.229052 4842 generic.go:334] "Generic (PLEG): container finished" podID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerID="a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3" exitCode=143 Mar 11 19:13:32 crc kubenswrapper[4842]: I0311 19:13:32.229080 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf","Type":"ContainerDied","Data":"a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3"} Mar 11 19:13:32 crc kubenswrapper[4842]: E0311 19:13:32.465005 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:13:32 crc kubenswrapper[4842]: E0311 19:13:32.466293 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:13:32 crc kubenswrapper[4842]: E0311 19:13:32.467483 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:13:32 crc kubenswrapper[4842]: E0311 19:13:32.467561 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="57c772b4-865d-48e4-bddd-28346fd4ae3b" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:13:34 crc kubenswrapper[4842]: I0311 19:13:34.992591 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.023006 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.088889 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.101444 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c772b4-865d-48e4-bddd-28346fd4ae3b-config-data\") pod \"57c772b4-865d-48e4-bddd-28346fd4ae3b\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.101605 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmzfl\" (UniqueName: \"kubernetes.io/projected/57c772b4-865d-48e4-bddd-28346fd4ae3b-kube-api-access-wmzfl\") pod \"57c772b4-865d-48e4-bddd-28346fd4ae3b\" (UID: \"57c772b4-865d-48e4-bddd-28346fd4ae3b\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.108085 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c772b4-865d-48e4-bddd-28346fd4ae3b-kube-api-access-wmzfl" (OuterVolumeSpecName: "kube-api-access-wmzfl") pod "57c772b4-865d-48e4-bddd-28346fd4ae3b" (UID: "57c772b4-865d-48e4-bddd-28346fd4ae3b"). InnerVolumeSpecName "kube-api-access-wmzfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.141403 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57c772b4-865d-48e4-bddd-28346fd4ae3b-config-data" (OuterVolumeSpecName: "config-data") pod "57c772b4-865d-48e4-bddd-28346fd4ae3b" (UID: "57c772b4-865d-48e4-bddd-28346fd4ae3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.202896 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-config-data\") pod \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.202956 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da239a7e-377d-4e6c-b6f8-a159ad929950-config-data\") pod \"da239a7e-377d-4e6c-b6f8-a159ad929950\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203010 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j29t\" (UniqueName: \"kubernetes.io/projected/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-kube-api-access-4j29t\") pod \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203056 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-logs\") pod \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\" (UID: \"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203109 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da239a7e-377d-4e6c-b6f8-a159ad929950-logs\") pod \"da239a7e-377d-4e6c-b6f8-a159ad929950\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203186 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2ntz\" (UniqueName: 
\"kubernetes.io/projected/da239a7e-377d-4e6c-b6f8-a159ad929950-kube-api-access-g2ntz\") pod \"da239a7e-377d-4e6c-b6f8-a159ad929950\" (UID: \"da239a7e-377d-4e6c-b6f8-a159ad929950\") " Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203615 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmzfl\" (UniqueName: \"kubernetes.io/projected/57c772b4-865d-48e4-bddd-28346fd4ae3b-kube-api-access-wmzfl\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203639 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c772b4-865d-48e4-bddd-28346fd4ae3b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203717 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-logs" (OuterVolumeSpecName: "logs") pod "59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" (UID: "59ab8561-32d5-4ec9-a2bb-d859ef4b03cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.203727 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da239a7e-377d-4e6c-b6f8-a159ad929950-logs" (OuterVolumeSpecName: "logs") pod "da239a7e-377d-4e6c-b6f8-a159ad929950" (UID: "da239a7e-377d-4e6c-b6f8-a159ad929950"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.205847 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-kube-api-access-4j29t" (OuterVolumeSpecName: "kube-api-access-4j29t") pod "59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" (UID: "59ab8561-32d5-4ec9-a2bb-d859ef4b03cf"). InnerVolumeSpecName "kube-api-access-4j29t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.206115 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da239a7e-377d-4e6c-b6f8-a159ad929950-kube-api-access-g2ntz" (OuterVolumeSpecName: "kube-api-access-g2ntz") pod "da239a7e-377d-4e6c-b6f8-a159ad929950" (UID: "da239a7e-377d-4e6c-b6f8-a159ad929950"). InnerVolumeSpecName "kube-api-access-g2ntz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.220922 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da239a7e-377d-4e6c-b6f8-a159ad929950-config-data" (OuterVolumeSpecName: "config-data") pod "da239a7e-377d-4e6c-b6f8-a159ad929950" (UID: "da239a7e-377d-4e6c-b6f8-a159ad929950"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.222900 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-config-data" (OuterVolumeSpecName: "config-data") pod "59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" (UID: "59ab8561-32d5-4ec9-a2bb-d859ef4b03cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.256891 4842 generic.go:334] "Generic (PLEG): container finished" podID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerID="4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a" exitCode=0 Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.256942 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.256959 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf","Type":"ContainerDied","Data":"4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a"} Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.256983 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"59ab8561-32d5-4ec9-a2bb-d859ef4b03cf","Type":"ContainerDied","Data":"7aa5e72eb43ae9b51ef3cf67f60202fc59348557b3ee3c6e8d7d03377bacd7fe"} Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.256999 4842 scope.go:117] "RemoveContainer" containerID="4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.260054 4842 generic.go:334] "Generic (PLEG): container finished" podID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerID="7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1" exitCode=0 Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.260133 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.260153 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"da239a7e-377d-4e6c-b6f8-a159ad929950","Type":"ContainerDied","Data":"7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1"} Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.260187 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"da239a7e-377d-4e6c-b6f8-a159ad929950","Type":"ContainerDied","Data":"54b634c5a87f836a3f2893362b2cf7acae003ea48f86ec6bf5a17ec514e75134"} Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.261959 4842 generic.go:334] "Generic (PLEG): container finished" podID="57c772b4-865d-48e4-bddd-28346fd4ae3b" containerID="2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" exitCode=0 Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.261982 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"57c772b4-865d-48e4-bddd-28346fd4ae3b","Type":"ContainerDied","Data":"2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3"} Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.261995 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.261999 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"57c772b4-865d-48e4-bddd-28346fd4ae3b","Type":"ContainerDied","Data":"fbc3e4fdb4c9085d3fe2b6dcc5353ae202b13d26f44531abbc290f5b4147c2b5"} Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.276845 4842 scope.go:117] "RemoveContainer" containerID="a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.302960 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.306250 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2ntz\" (UniqueName: \"kubernetes.io/projected/da239a7e-377d-4e6c-b6f8-a159ad929950-kube-api-access-g2ntz\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.306303 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.306319 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da239a7e-377d-4e6c-b6f8-a159ad929950-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.306330 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j29t\" (UniqueName: \"kubernetes.io/projected/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-kube-api-access-4j29t\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.306341 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.306351 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da239a7e-377d-4e6c-b6f8-a159ad929950-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.312995 4842 scope.go:117] "RemoveContainer" containerID="4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.313616 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a\": container with ID starting with 4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a not found: ID does not exist" containerID="4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.313656 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a"} err="failed to get container status \"4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a\": rpc error: code = NotFound desc = could not find container \"4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a\": container with ID starting with 4a4a3945f8fac2992aeae3bf6391c264e9d535a5a31b3b37ea5f8442ee10276a not found: ID does not exist" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.313688 4842 scope.go:117] "RemoveContainer" containerID="a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.314830 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.316892 
4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3\": container with ID starting with a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3 not found: ID does not exist" containerID="a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.316919 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3"} err="failed to get container status \"a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3\": rpc error: code = NotFound desc = could not find container \"a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3\": container with ID starting with a5678893272d36bc4d64e74ccf810e9bb3e15f177eb7eac40bde6ae2c34e16e3 not found: ID does not exist" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.316941 4842 scope.go:117] "RemoveContainer" containerID="7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.340407 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.349167 4842 scope.go:117] "RemoveContainer" containerID="df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.356532 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.357306 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c772b4-865d-48e4-bddd-28346fd4ae3b" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357323 4842 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="57c772b4-865d-48e4-bddd-28346fd4ae3b" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.357340 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-log" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357347 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-log" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.357364 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-api" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357372 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-api" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.357383 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-metadata" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357390 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-metadata" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.357400 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042cdb69-b291-4c2d-a73b-e5ed47ee3760" containerName="nova-manage" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357406 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="042cdb69-b291-4c2d-a73b-e5ed47ee3760" containerName="nova-manage" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.357420 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-log" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 
19:13:35.357428 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-log" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357618 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="042cdb69-b291-4c2d-a73b-e5ed47ee3760" containerName="nova-manage" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357637 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-metadata" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357649 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-api" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357662 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" containerName="nova-kuttl-metadata-log" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357677 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" containerName="nova-kuttl-api-log" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.357694 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c772b4-865d-48e4-bddd-28346fd4ae3b" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.358394 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.369259 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.382306 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.390411 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.395531 4842 scope.go:117] "RemoveContainer" containerID="7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.402799 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1\": container with ID starting with 7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1 not found: ID does not exist" containerID="7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.402841 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1"} err="failed to get container status \"7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1\": rpc error: code = NotFound desc = could not find container \"7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1\": container with ID starting with 7765d95692ea32286218e0a1590e85989d5b568e41e04f66939a3bcd6f160df1 not found: ID does not exist" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.402869 4842 scope.go:117] "RemoveContainer" 
containerID="df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.403843 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718\": container with ID starting with df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718 not found: ID does not exist" containerID="df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.403918 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718"} err="failed to get container status \"df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718\": rpc error: code = NotFound desc = could not find container \"df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718\": container with ID starting with df165c5485ae57b7af1d563fd89e1d9121e1542469f39f01d39b9eb60b42f718 not found: ID does not exist" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.403950 4842 scope.go:117] "RemoveContainer" containerID="2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.427464 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.428554 4842 scope.go:117] "RemoveContainer" containerID="2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" Mar 11 19:13:35 crc kubenswrapper[4842]: E0311 19:13:35.428997 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3\": container with ID starting with 
2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3 not found: ID does not exist" containerID="2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.429042 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3"} err="failed to get container status \"2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3\": rpc error: code = NotFound desc = could not find container \"2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3\": container with ID starting with 2e9158224b0f4e584065b2feca8bf4a70807219ce95ac3758e09f524909cabe3 not found: ID does not exist" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.436536 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.438984 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.440862 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.445530 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.453406 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.461913 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.463289 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.465181 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.469423 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.509325 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9nz2\" (UniqueName: \"kubernetes.io/projected/d74cb58e-301a-4e57-81ee-a24e50e07e13-kube-api-access-l9nz2\") pod \"nova-kuttl-scheduler-0\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.509386 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74cb58e-301a-4e57-81ee-a24e50e07e13-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.610549 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56680d2-4bcf-43e4-99b7-18b2838047bb-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.610595 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-config-data\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 
19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.610651 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9nz2\" (UniqueName: \"kubernetes.io/projected/d74cb58e-301a-4e57-81ee-a24e50e07e13-kube-api-access-l9nz2\") pod \"nova-kuttl-scheduler-0\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.610676 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-logs\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.610697 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56680d2-4bcf-43e4-99b7-18b2838047bb-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.610721 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7jq7\" (UniqueName: \"kubernetes.io/projected/c56680d2-4bcf-43e4-99b7-18b2838047bb-kube-api-access-h7jq7\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.610761 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74cb58e-301a-4e57-81ee-a24e50e07e13-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: 
I0311 19:13:35.610788 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xd9\" (UniqueName: \"kubernetes.io/projected/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-kube-api-access-r6xd9\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.616200 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74cb58e-301a-4e57-81ee-a24e50e07e13-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.629668 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9nz2\" (UniqueName: \"kubernetes.io/projected/d74cb58e-301a-4e57-81ee-a24e50e07e13-kube-api-access-l9nz2\") pod \"nova-kuttl-scheduler-0\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.685392 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712174 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56680d2-4bcf-43e4-99b7-18b2838047bb-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712225 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-config-data\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712303 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-logs\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712324 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56680d2-4bcf-43e4-99b7-18b2838047bb-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712351 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7jq7\" (UniqueName: \"kubernetes.io/projected/c56680d2-4bcf-43e4-99b7-18b2838047bb-kube-api-access-h7jq7\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712391 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xd9\" (UniqueName: \"kubernetes.io/projected/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-kube-api-access-r6xd9\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712652 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-logs\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.712652 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56680d2-4bcf-43e4-99b7-18b2838047bb-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.716009 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56680d2-4bcf-43e4-99b7-18b2838047bb-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.718668 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-config-data\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.730522 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xd9\" (UniqueName: 
\"kubernetes.io/projected/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-kube-api-access-r6xd9\") pod \"nova-kuttl-api-0\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.734725 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7jq7\" (UniqueName: \"kubernetes.io/projected/c56680d2-4bcf-43e4-99b7-18b2838047bb-kube-api-access-h7jq7\") pod \"nova-kuttl-metadata-0\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.759846 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:35 crc kubenswrapper[4842]: I0311 19:13:35.778024 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.113559 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:13:36 crc kubenswrapper[4842]: W0311 19:13:36.115263 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68d35f8c_da79_4788_91e2_ac2ee3f9f79b.slice/crio-2f950925016cc57ed138e4a996e86146e9a052f6ed4bf653d41d8408bc5b7abd WatchSource:0}: Error finding container 2f950925016cc57ed138e4a996e86146e9a052f6ed4bf653d41d8408bc5b7abd: Status 404 returned error can't find the container with id 2f950925016cc57ed138e4a996e86146e9a052f6ed4bf653d41d8408bc5b7abd Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.171141 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.260347 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] 
Mar 11 19:13:36 crc kubenswrapper[4842]: W0311 19:13:36.264095 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc56680d2_4bcf_43e4_99b7_18b2838047bb.slice/crio-fe0883c887df6dbc6a2ca4a04db6c76df6d816fe718cb8416c91f02b07e9da52 WatchSource:0}: Error finding container fe0883c887df6dbc6a2ca4a04db6c76df6d816fe718cb8416c91f02b07e9da52: Status 404 returned error can't find the container with id fe0883c887df6dbc6a2ca4a04db6c76df6d816fe718cb8416c91f02b07e9da52 Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.274950 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d74cb58e-301a-4e57-81ee-a24e50e07e13","Type":"ContainerStarted","Data":"ad4b5d198dfafa10586ca2c049ff7d85dd19355e37d0460d522f1ae9fe01155c"} Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.276580 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"68d35f8c-da79-4788-91e2-ac2ee3f9f79b","Type":"ContainerStarted","Data":"2f950925016cc57ed138e4a996e86146e9a052f6ed4bf653d41d8408bc5b7abd"} Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.973667 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c772b4-865d-48e4-bddd-28346fd4ae3b" path="/var/lib/kubelet/pods/57c772b4-865d-48e4-bddd-28346fd4ae3b/volumes" Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.974535 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ab8561-32d5-4ec9-a2bb-d859ef4b03cf" path="/var/lib/kubelet/pods/59ab8561-32d5-4ec9-a2bb-d859ef4b03cf/volumes" Mar 11 19:13:36 crc kubenswrapper[4842]: I0311 19:13:36.975345 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da239a7e-377d-4e6c-b6f8-a159ad929950" path="/var/lib/kubelet/pods/da239a7e-377d-4e6c-b6f8-a159ad929950/volumes" Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.298959 4842 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"c56680d2-4bcf-43e4-99b7-18b2838047bb","Type":"ContainerStarted","Data":"dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca"} Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.299020 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"c56680d2-4bcf-43e4-99b7-18b2838047bb","Type":"ContainerStarted","Data":"e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75"} Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.299043 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"c56680d2-4bcf-43e4-99b7-18b2838047bb","Type":"ContainerStarted","Data":"fe0883c887df6dbc6a2ca4a04db6c76df6d816fe718cb8416c91f02b07e9da52"} Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.301698 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d74cb58e-301a-4e57-81ee-a24e50e07e13","Type":"ContainerStarted","Data":"373c51a303721fec9abeebd8f8647af93b99c28494334462581e29e5ff6eee26"} Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.304144 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"68d35f8c-da79-4788-91e2-ac2ee3f9f79b","Type":"ContainerStarted","Data":"fe182fe268a3a54536616ad98fb51448335992a69d170e0821bd01e20a2cbcfc"} Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.304198 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"68d35f8c-da79-4788-91e2-ac2ee3f9f79b","Type":"ContainerStarted","Data":"c4c81403089b4ff965c28c1239a5d468868fca1e026fd9bf32239857f201514a"} Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.334380 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.334337801 podStartE2EDuration="2.334337801s" podCreationTimestamp="2026-03-11 19:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:37.324094245 +0000 UTC m=+1462.971790565" watchObservedRunningTime="2026-03-11 19:13:37.334337801 +0000 UTC m=+1462.982034091" Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.361576 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.361551215 podStartE2EDuration="2.361551215s" podCreationTimestamp="2026-03-11 19:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:37.35877082 +0000 UTC m=+1463.006467120" watchObservedRunningTime="2026-03-11 19:13:37.361551215 +0000 UTC m=+1463.009247535" Mar 11 19:13:37 crc kubenswrapper[4842]: I0311 19:13:37.383940 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.383914188 podStartE2EDuration="2.383914188s" podCreationTimestamp="2026-03-11 19:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:37.37954079 +0000 UTC m=+1463.027237100" watchObservedRunningTime="2026-03-11 19:13:37.383914188 +0000 UTC m=+1463.031610508" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.064127 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h2vxt"] Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.066575 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.078777 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2vxt"] Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.254827 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klhx4\" (UniqueName: \"kubernetes.io/projected/dc346174-b8e6-4317-9352-51e017e3439c-kube-api-access-klhx4\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.254910 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-catalog-content\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.254952 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-utilities\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.356124 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-catalog-content\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.356188 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-utilities\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.356265 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klhx4\" (UniqueName: \"kubernetes.io/projected/dc346174-b8e6-4317-9352-51e017e3439c-kube-api-access-klhx4\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.356676 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-catalog-content\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.356809 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-utilities\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.375952 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klhx4\" (UniqueName: \"kubernetes.io/projected/dc346174-b8e6-4317-9352-51e017e3439c-kube-api-access-klhx4\") pod \"redhat-marketplace-h2vxt\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.389005 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:38 crc kubenswrapper[4842]: I0311 19:13:38.906515 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2vxt"] Mar 11 19:13:38 crc kubenswrapper[4842]: W0311 19:13:38.911302 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc346174_b8e6_4317_9352_51e017e3439c.slice/crio-089f7351c116886096f8a8aba2ae4ce17c70d2aef170f529efe188e779feaba9 WatchSource:0}: Error finding container 089f7351c116886096f8a8aba2ae4ce17c70d2aef170f529efe188e779feaba9: Status 404 returned error can't find the container with id 089f7351c116886096f8a8aba2ae4ce17c70d2aef170f529efe188e779feaba9 Mar 11 19:13:39 crc kubenswrapper[4842]: I0311 19:13:39.328828 4842 generic.go:334] "Generic (PLEG): container finished" podID="dc346174-b8e6-4317-9352-51e017e3439c" containerID="c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff" exitCode=0 Mar 11 19:13:39 crc kubenswrapper[4842]: I0311 19:13:39.328872 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2vxt" event={"ID":"dc346174-b8e6-4317-9352-51e017e3439c","Type":"ContainerDied","Data":"c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff"} Mar 11 19:13:39 crc kubenswrapper[4842]: I0311 19:13:39.329120 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2vxt" event={"ID":"dc346174-b8e6-4317-9352-51e017e3439c","Type":"ContainerStarted","Data":"089f7351c116886096f8a8aba2ae4ce17c70d2aef170f529efe188e779feaba9"} Mar 11 19:13:40 crc kubenswrapper[4842]: I0311 19:13:40.686614 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:41 crc kubenswrapper[4842]: I0311 19:13:41.346914 4842 generic.go:334] "Generic (PLEG): container finished" 
podID="dc346174-b8e6-4317-9352-51e017e3439c" containerID="5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced" exitCode=0 Mar 11 19:13:41 crc kubenswrapper[4842]: I0311 19:13:41.346960 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2vxt" event={"ID":"dc346174-b8e6-4317-9352-51e017e3439c","Type":"ContainerDied","Data":"5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced"} Mar 11 19:13:42 crc kubenswrapper[4842]: I0311 19:13:42.361431 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2vxt" event={"ID":"dc346174-b8e6-4317-9352-51e017e3439c","Type":"ContainerStarted","Data":"ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1"} Mar 11 19:13:42 crc kubenswrapper[4842]: I0311 19:13:42.389316 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h2vxt" podStartSLOduration=1.8329729559999999 podStartE2EDuration="4.389259459s" podCreationTimestamp="2026-03-11 19:13:38 +0000 UTC" firstStartedPulling="2026-03-11 19:13:39.331391702 +0000 UTC m=+1464.979087982" lastFinishedPulling="2026-03-11 19:13:41.887678195 +0000 UTC m=+1467.535374485" observedRunningTime="2026-03-11 19:13:42.382666591 +0000 UTC m=+1468.030362891" watchObservedRunningTime="2026-03-11 19:13:42.389259459 +0000 UTC m=+1468.036955779" Mar 11 19:13:45 crc kubenswrapper[4842]: I0311 19:13:45.686334 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:45 crc kubenswrapper[4842]: I0311 19:13:45.722606 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:45 crc kubenswrapper[4842]: I0311 19:13:45.760732 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:45 crc 
kubenswrapper[4842]: I0311 19:13:45.760796 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:45 crc kubenswrapper[4842]: I0311 19:13:45.778396 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:45 crc kubenswrapper[4842]: I0311 19:13:45.778489 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:46 crc kubenswrapper[4842]: I0311 19:13:46.422196 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:13:46 crc kubenswrapper[4842]: I0311 19:13:46.890568 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.145:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:46 crc kubenswrapper[4842]: I0311 19:13:46.931526 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.145:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:46 crc kubenswrapper[4842]: I0311 19:13:46.931869 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.144:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:46 crc kubenswrapper[4842]: I0311 19:13:46.931906 4842 prober.go:107] "Probe failed" probeType="Startup" 
pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.144:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:13:47 crc kubenswrapper[4842]: I0311 19:13:47.035078 4842 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod013b01a1-a53b-4c9f-b525-7fc883306119"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod013b01a1-a53b-4c9f-b525-7fc883306119] : Timed out while waiting for systemd to remove kubepods-besteffort-pod013b01a1_a53b_4c9f_b525_7fc883306119.slice" Mar 11 19:13:48 crc kubenswrapper[4842]: I0311 19:13:48.390013 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:48 crc kubenswrapper[4842]: I0311 19:13:48.390255 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:48 crc kubenswrapper[4842]: I0311 19:13:48.437135 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:49 crc kubenswrapper[4842]: I0311 19:13:49.466020 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:51 crc kubenswrapper[4842]: I0311 19:13:51.654701 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2vxt"] Mar 11 19:13:51 crc kubenswrapper[4842]: I0311 19:13:51.655217 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h2vxt" podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="registry-server" 
containerID="cri-o://ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1" gracePeriod=2 Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.076946 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.147027 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klhx4\" (UniqueName: \"kubernetes.io/projected/dc346174-b8e6-4317-9352-51e017e3439c-kube-api-access-klhx4\") pod \"dc346174-b8e6-4317-9352-51e017e3439c\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.149101 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-catalog-content\") pod \"dc346174-b8e6-4317-9352-51e017e3439c\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.149155 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-utilities\") pod \"dc346174-b8e6-4317-9352-51e017e3439c\" (UID: \"dc346174-b8e6-4317-9352-51e017e3439c\") " Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.150414 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-utilities" (OuterVolumeSpecName: "utilities") pod "dc346174-b8e6-4317-9352-51e017e3439c" (UID: "dc346174-b8e6-4317-9352-51e017e3439c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.152204 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.152862 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc346174-b8e6-4317-9352-51e017e3439c-kube-api-access-klhx4" (OuterVolumeSpecName: "kube-api-access-klhx4") pod "dc346174-b8e6-4317-9352-51e017e3439c" (UID: "dc346174-b8e6-4317-9352-51e017e3439c"). InnerVolumeSpecName "kube-api-access-klhx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.177125 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc346174-b8e6-4317-9352-51e017e3439c" (UID: "dc346174-b8e6-4317-9352-51e017e3439c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.255681 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc346174-b8e6-4317-9352-51e017e3439c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.255722 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klhx4\" (UniqueName: \"kubernetes.io/projected/dc346174-b8e6-4317-9352-51e017e3439c-kube-api-access-klhx4\") on node \"crc\" DevicePath \"\"" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.448054 4842 generic.go:334] "Generic (PLEG): container finished" podID="dc346174-b8e6-4317-9352-51e017e3439c" containerID="ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1" exitCode=0 Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.448094 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2vxt" event={"ID":"dc346174-b8e6-4317-9352-51e017e3439c","Type":"ContainerDied","Data":"ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1"} Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.448691 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2vxt" event={"ID":"dc346174-b8e6-4317-9352-51e017e3439c","Type":"ContainerDied","Data":"089f7351c116886096f8a8aba2ae4ce17c70d2aef170f529efe188e779feaba9"} Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.448123 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2vxt" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.448775 4842 scope.go:117] "RemoveContainer" containerID="ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.470503 4842 scope.go:117] "RemoveContainer" containerID="5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.484553 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2vxt"] Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.494126 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2vxt"] Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.515045 4842 scope.go:117] "RemoveContainer" containerID="c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.544287 4842 scope.go:117] "RemoveContainer" containerID="ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1" Mar 11 19:13:52 crc kubenswrapper[4842]: E0311 19:13:52.545168 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1\": container with ID starting with ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1 not found: ID does not exist" containerID="ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.545205 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1"} err="failed to get container status \"ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1\": rpc error: code = NotFound desc = could not find container 
\"ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1\": container with ID starting with ed49ceff2373eb4da5a56bb1599793f665cdbb012056e5859dacceb38b0244e1 not found: ID does not exist" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.545230 4842 scope.go:117] "RemoveContainer" containerID="5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced" Mar 11 19:13:52 crc kubenswrapper[4842]: E0311 19:13:52.545625 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced\": container with ID starting with 5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced not found: ID does not exist" containerID="5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.545651 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced"} err="failed to get container status \"5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced\": rpc error: code = NotFound desc = could not find container \"5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced\": container with ID starting with 5caf4df158fe8b7b2e8f6a280e14b1962313be363238a64239e88f906eb7aced not found: ID does not exist" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.545668 4842 scope.go:117] "RemoveContainer" containerID="c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff" Mar 11 19:13:52 crc kubenswrapper[4842]: E0311 19:13:52.545930 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff\": container with ID starting with c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff not found: ID does not exist" 
containerID="c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.545959 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff"} err="failed to get container status \"c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff\": rpc error: code = NotFound desc = could not find container \"c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff\": container with ID starting with c51dd74f49469bced4ff71b804f358abf07d1b55612f2ca8dde7b0c3454aeeff not found: ID does not exist" Mar 11 19:13:52 crc kubenswrapper[4842]: I0311 19:13:52.972771 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc346174-b8e6-4317-9352-51e017e3439c" path="/var/lib/kubelet/pods/dc346174-b8e6-4317-9352-51e017e3439c/volumes" Mar 11 19:13:53 crc kubenswrapper[4842]: I0311 19:13:53.760999 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:53 crc kubenswrapper[4842]: I0311 19:13:53.761055 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:53 crc kubenswrapper[4842]: I0311 19:13:53.778315 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:53 crc kubenswrapper[4842]: I0311 19:13:53.778416 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:55 crc kubenswrapper[4842]: I0311 19:13:55.764123 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:55 crc kubenswrapper[4842]: I0311 19:13:55.764919 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 
11 19:13:55 crc kubenswrapper[4842]: I0311 19:13:55.767502 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:55 crc kubenswrapper[4842]: I0311 19:13:55.791238 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:55 crc kubenswrapper[4842]: I0311 19:13:55.810136 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:55 crc kubenswrapper[4842]: I0311 19:13:55.810543 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:56 crc kubenswrapper[4842]: I0311 19:13:56.482498 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:13:56 crc kubenswrapper[4842]: I0311 19:13:56.483667 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.196681 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5"] Mar 11 19:13:58 crc kubenswrapper[4842]: E0311 19:13:58.197313 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="registry-server" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.197325 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="registry-server" Mar 11 19:13:58 crc kubenswrapper[4842]: E0311 19:13:58.197339 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="extract-utilities" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.197345 4842 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="extract-utilities" Mar 11 19:13:58 crc kubenswrapper[4842]: E0311 19:13:58.197355 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="extract-content" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.197361 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="extract-content" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.197519 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc346174-b8e6-4317-9352-51e017e3439c" containerName="registry-server" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.198036 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.200333 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-scripts" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.200544 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-config-data" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.212993 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5"] Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.253758 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-scripts\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.253942 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-config-data\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.254161 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnr2\" (UniqueName: \"kubernetes.io/projected/34e1d8bf-ceef-4519-affb-14fe1769a799-kube-api-access-zgnr2\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.355344 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnr2\" (UniqueName: \"kubernetes.io/projected/34e1d8bf-ceef-4519-affb-14fe1769a799-kube-api-access-zgnr2\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.355439 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-scripts\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.355502 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-config-data\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.362219 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-config-data\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.370826 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-scripts\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.370898 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnr2\" (UniqueName: \"kubernetes.io/projected/34e1d8bf-ceef-4519-affb-14fe1769a799-kube-api-access-zgnr2\") pod \"nova-kuttl-cell1-cell-delete-sk9t5\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.531045 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:13:58 crc kubenswrapper[4842]: I0311 19:13:58.972098 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5"] Mar 11 19:13:59 crc kubenswrapper[4842]: I0311 19:13:59.506198 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" event={"ID":"34e1d8bf-ceef-4519-affb-14fe1769a799","Type":"ContainerStarted","Data":"595f1f96e5649954853bf82aca7f1546a636c69c45cca8e454c1919142527ee8"} Mar 11 19:13:59 crc kubenswrapper[4842]: I0311 19:13:59.507552 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" event={"ID":"34e1d8bf-ceef-4519-affb-14fe1769a799","Type":"ContainerStarted","Data":"18cdfade4299adb5d301f82b8870b2082c4333af4b4b3f9597103c06568f6997"} Mar 11 19:13:59 crc kubenswrapper[4842]: I0311 19:13:59.520559 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" podStartSLOduration=1.520544924 podStartE2EDuration="1.520544924s" podCreationTimestamp="2026-03-11 19:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:13:59.518646322 +0000 UTC m=+1485.166342602" watchObservedRunningTime="2026-03-11 19:13:59.520544924 +0000 UTC m=+1485.168241204" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.129909 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554274-stqtr"] Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.131337 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554274-stqtr" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.134135 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.134350 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.134601 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.138536 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554274-stqtr"] Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.185123 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95bxh\" (UniqueName: \"kubernetes.io/projected/c7a80557-5c7f-4f12-a2a1-60c9d6555fa9-kube-api-access-95bxh\") pod \"auto-csr-approver-29554274-stqtr\" (UID: \"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9\") " pod="openshift-infra/auto-csr-approver-29554274-stqtr" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.257216 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-94wmq"] Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.259111 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.270019 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-94wmq"] Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.288237 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-utilities\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.288313 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95bxh\" (UniqueName: \"kubernetes.io/projected/c7a80557-5c7f-4f12-a2a1-60c9d6555fa9-kube-api-access-95bxh\") pod \"auto-csr-approver-29554274-stqtr\" (UID: \"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9\") " pod="openshift-infra/auto-csr-approver-29554274-stqtr" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.288428 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-catalog-content\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.288489 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq572\" (UniqueName: \"kubernetes.io/projected/a2964bda-6449-4dd7-b3de-335ad767704c-kube-api-access-dq572\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.319872 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-95bxh\" (UniqueName: \"kubernetes.io/projected/c7a80557-5c7f-4f12-a2a1-60c9d6555fa9-kube-api-access-95bxh\") pod \"auto-csr-approver-29554274-stqtr\" (UID: \"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9\") " pod="openshift-infra/auto-csr-approver-29554274-stqtr" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.389760 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-catalog-content\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.389837 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq572\" (UniqueName: \"kubernetes.io/projected/a2964bda-6449-4dd7-b3de-335ad767704c-kube-api-access-dq572\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.389914 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-utilities\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.390366 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-catalog-content\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.390415 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-utilities\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.407156 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq572\" (UniqueName: \"kubernetes.io/projected/a2964bda-6449-4dd7-b3de-335ad767704c-kube-api-access-dq572\") pod \"redhat-operators-94wmq\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.451282 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554274-stqtr" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.584355 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:00 crc kubenswrapper[4842]: I0311 19:14:00.721786 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554274-stqtr"] Mar 11 19:14:00 crc kubenswrapper[4842]: W0311 19:14:00.733979 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7a80557_5c7f_4f12_a2a1_60c9d6555fa9.slice/crio-f8f5870264ab5292bd1dd38ce37a934f0effe6681abfaa726cc59b9f335df7d5 WatchSource:0}: Error finding container f8f5870264ab5292bd1dd38ce37a934f0effe6681abfaa726cc59b9f335df7d5: Status 404 returned error can't find the container with id f8f5870264ab5292bd1dd38ce37a934f0effe6681abfaa726cc59b9f335df7d5 Mar 11 19:14:01 crc kubenswrapper[4842]: I0311 19:14:01.046837 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-94wmq"] Mar 11 19:14:01 crc kubenswrapper[4842]: W0311 19:14:01.051029 4842 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2964bda_6449_4dd7_b3de_335ad767704c.slice/crio-80d9ddb7941743ace124d30e30ada5a0f1123ad9cfa3ae88b40d54b0130837b7 WatchSource:0}: Error finding container 80d9ddb7941743ace124d30e30ada5a0f1123ad9cfa3ae88b40d54b0130837b7: Status 404 returned error can't find the container with id 80d9ddb7941743ace124d30e30ada5a0f1123ad9cfa3ae88b40d54b0130837b7 Mar 11 19:14:01 crc kubenswrapper[4842]: I0311 19:14:01.471764 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:14:01 crc kubenswrapper[4842]: I0311 19:14:01.472143 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:14:01 crc kubenswrapper[4842]: I0311 19:14:01.526674 4842 generic.go:334] "Generic (PLEG): container finished" podID="a2964bda-6449-4dd7-b3de-335ad767704c" containerID="4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7" exitCode=0 Mar 11 19:14:01 crc kubenswrapper[4842]: I0311 19:14:01.526726 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94wmq" event={"ID":"a2964bda-6449-4dd7-b3de-335ad767704c","Type":"ContainerDied","Data":"4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7"} Mar 11 19:14:01 crc kubenswrapper[4842]: I0311 19:14:01.527057 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94wmq" 
event={"ID":"a2964bda-6449-4dd7-b3de-335ad767704c","Type":"ContainerStarted","Data":"80d9ddb7941743ace124d30e30ada5a0f1123ad9cfa3ae88b40d54b0130837b7"} Mar 11 19:14:01 crc kubenswrapper[4842]: I0311 19:14:01.528507 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554274-stqtr" event={"ID":"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9","Type":"ContainerStarted","Data":"f8f5870264ab5292bd1dd38ce37a934f0effe6681abfaa726cc59b9f335df7d5"} Mar 11 19:14:02 crc kubenswrapper[4842]: I0311 19:14:02.538215 4842 generic.go:334] "Generic (PLEG): container finished" podID="c7a80557-5c7f-4f12-a2a1-60c9d6555fa9" containerID="dd94d5f5babba55d95fa252012f301fd612f1f3e22e6c4287831053ffc293669" exitCode=0 Mar 11 19:14:02 crc kubenswrapper[4842]: I0311 19:14:02.538291 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554274-stqtr" event={"ID":"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9","Type":"ContainerDied","Data":"dd94d5f5babba55d95fa252012f301fd612f1f3e22e6c4287831053ffc293669"} Mar 11 19:14:03 crc kubenswrapper[4842]: I0311 19:14:03.550445 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94wmq" event={"ID":"a2964bda-6449-4dd7-b3de-335ad767704c","Type":"ContainerStarted","Data":"bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646"} Mar 11 19:14:03 crc kubenswrapper[4842]: I0311 19:14:03.905940 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554274-stqtr" Mar 11 19:14:03 crc kubenswrapper[4842]: I0311 19:14:03.953913 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95bxh\" (UniqueName: \"kubernetes.io/projected/c7a80557-5c7f-4f12-a2a1-60c9d6555fa9-kube-api-access-95bxh\") pod \"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9\" (UID: \"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9\") " Mar 11 19:14:03 crc kubenswrapper[4842]: I0311 19:14:03.959717 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a80557-5c7f-4f12-a2a1-60c9d6555fa9-kube-api-access-95bxh" (OuterVolumeSpecName: "kube-api-access-95bxh") pod "c7a80557-5c7f-4f12-a2a1-60c9d6555fa9" (UID: "c7a80557-5c7f-4f12-a2a1-60c9d6555fa9"). InnerVolumeSpecName "kube-api-access-95bxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.055939 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95bxh\" (UniqueName: \"kubernetes.io/projected/c7a80557-5c7f-4f12-a2a1-60c9d6555fa9-kube-api-access-95bxh\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.559883 4842 generic.go:334] "Generic (PLEG): container finished" podID="34e1d8bf-ceef-4519-affb-14fe1769a799" containerID="595f1f96e5649954853bf82aca7f1546a636c69c45cca8e454c1919142527ee8" exitCode=0 Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.559947 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" event={"ID":"34e1d8bf-ceef-4519-affb-14fe1769a799","Type":"ContainerDied","Data":"595f1f96e5649954853bf82aca7f1546a636c69c45cca8e454c1919142527ee8"} Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.561633 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554274-stqtr" 
event={"ID":"c7a80557-5c7f-4f12-a2a1-60c9d6555fa9","Type":"ContainerDied","Data":"f8f5870264ab5292bd1dd38ce37a934f0effe6681abfaa726cc59b9f335df7d5"} Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.561664 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554274-stqtr" Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.561682 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8f5870264ab5292bd1dd38ce37a934f0effe6681abfaa726cc59b9f335df7d5" Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.976182 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554268-spxtq"] Mar 11 19:14:04 crc kubenswrapper[4842]: I0311 19:14:04.983262 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554268-spxtq"] Mar 11 19:14:05 crc kubenswrapper[4842]: I0311 19:14:05.572202 4842 generic.go:334] "Generic (PLEG): container finished" podID="a2964bda-6449-4dd7-b3de-335ad767704c" containerID="bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646" exitCode=0 Mar 11 19:14:05 crc kubenswrapper[4842]: I0311 19:14:05.572302 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94wmq" event={"ID":"a2964bda-6449-4dd7-b3de-335ad767704c","Type":"ContainerDied","Data":"bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646"} Mar 11 19:14:05 crc kubenswrapper[4842]: I0311 19:14:05.920545 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:14:05 crc kubenswrapper[4842]: I0311 19:14:05.986009 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgnr2\" (UniqueName: \"kubernetes.io/projected/34e1d8bf-ceef-4519-affb-14fe1769a799-kube-api-access-zgnr2\") pod \"34e1d8bf-ceef-4519-affb-14fe1769a799\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " Mar 11 19:14:05 crc kubenswrapper[4842]: I0311 19:14:05.986334 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-config-data\") pod \"34e1d8bf-ceef-4519-affb-14fe1769a799\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " Mar 11 19:14:05 crc kubenswrapper[4842]: I0311 19:14:05.986369 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-scripts\") pod \"34e1d8bf-ceef-4519-affb-14fe1769a799\" (UID: \"34e1d8bf-ceef-4519-affb-14fe1769a799\") " Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.004439 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-scripts" (OuterVolumeSpecName: "scripts") pod "34e1d8bf-ceef-4519-affb-14fe1769a799" (UID: "34e1d8bf-ceef-4519-affb-14fe1769a799"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.006679 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e1d8bf-ceef-4519-affb-14fe1769a799-kube-api-access-zgnr2" (OuterVolumeSpecName: "kube-api-access-zgnr2") pod "34e1d8bf-ceef-4519-affb-14fe1769a799" (UID: "34e1d8bf-ceef-4519-affb-14fe1769a799"). InnerVolumeSpecName "kube-api-access-zgnr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.013721 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-config-data" (OuterVolumeSpecName: "config-data") pod "34e1d8bf-ceef-4519-affb-14fe1769a799" (UID: "34e1d8bf-ceef-4519-affb-14fe1769a799"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.088318 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgnr2\" (UniqueName: \"kubernetes.io/projected/34e1d8bf-ceef-4519-affb-14fe1769a799-kube-api-access-zgnr2\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.088606 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.088762 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e1d8bf-ceef-4519-affb-14fe1769a799-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.581981 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" event={"ID":"34e1d8bf-ceef-4519-affb-14fe1769a799","Type":"ContainerDied","Data":"18cdfade4299adb5d301f82b8870b2082c4333af4b4b3f9597103c06568f6997"} Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.582054 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.582736 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18cdfade4299adb5d301f82b8870b2082c4333af4b4b3f9597103c06568f6997" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.863008 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell1b521-account-delete-dr2vl"] Mar 11 19:14:06 crc kubenswrapper[4842]: E0311 19:14:06.863749 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e1d8bf-ceef-4519-affb-14fe1769a799" containerName="nova-manage" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.863774 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e1d8bf-ceef-4519-affb-14fe1769a799" containerName="nova-manage" Mar 11 19:14:06 crc kubenswrapper[4842]: E0311 19:14:06.863796 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a80557-5c7f-4f12-a2a1-60c9d6555fa9" containerName="oc" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.863804 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a80557-5c7f-4f12-a2a1-60c9d6555fa9" containerName="oc" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.863988 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a80557-5c7f-4f12-a2a1-60c9d6555fa9" containerName="oc" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.864014 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e1d8bf-ceef-4519-affb-14fe1769a799" containerName="nova-manage" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.864785 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.874003 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1b521-account-delete-dr2vl"] Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.917830 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj5nk\" (UniqueName: \"kubernetes.io/projected/552b82d5-d576-45a9-94dc-0f653e61346a-kube-api-access-hj5nk\") pod \"novacell1b521-account-delete-dr2vl\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.917898 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/552b82d5-d576-45a9-94dc-0f653e61346a-operator-scripts\") pod \"novacell1b521-account-delete-dr2vl\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:06 crc kubenswrapper[4842]: I0311 19:14:06.977509 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7882c743-aaef-4544-b4c5-65daeadb4fbe" path="/var/lib/kubelet/pods/7882c743-aaef-4544-b4c5-65daeadb4fbe/volumes" Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.006647 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.006911 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="eff533f9-5b87-4048-a157-23b2f93578db" containerName="nova-kuttl-cell1-novncproxy-novncproxy" containerID="cri-o://d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3" gracePeriod=30 Mar 11 19:14:07 crc 
kubenswrapper[4842]: I0311 19:14:07.021657 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj5nk\" (UniqueName: \"kubernetes.io/projected/552b82d5-d576-45a9-94dc-0f653e61346a-kube-api-access-hj5nk\") pod \"novacell1b521-account-delete-dr2vl\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.021708 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/552b82d5-d576-45a9-94dc-0f653e61346a-operator-scripts\") pod \"novacell1b521-account-delete-dr2vl\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.022540 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/552b82d5-d576-45a9-94dc-0f653e61346a-operator-scripts\") pod \"novacell1b521-account-delete-dr2vl\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.024381 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.024601 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-log" containerID="cri-o://c4c81403089b4ff965c28c1239a5d468868fca1e026fd9bf32239857f201514a" gracePeriod=30 Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.024986 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" 
containerName="nova-kuttl-api-api" containerID="cri-o://fe182fe268a3a54536616ad98fb51448335992a69d170e0821bd01e20a2cbcfc" gracePeriod=30 Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.075366 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj5nk\" (UniqueName: \"kubernetes.io/projected/552b82d5-d576-45a9-94dc-0f653e61346a-kube-api-access-hj5nk\") pod \"novacell1b521-account-delete-dr2vl\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.201058 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.201304 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-log" containerID="cri-o://e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75" gracePeriod=30 Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.201455 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca" gracePeriod=30 Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.238634 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.603948 4842 generic.go:334] "Generic (PLEG): container finished" podID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerID="e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75" exitCode=143 Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.604032 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"c56680d2-4bcf-43e4-99b7-18b2838047bb","Type":"ContainerDied","Data":"e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75"} Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.610007 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94wmq" event={"ID":"a2964bda-6449-4dd7-b3de-335ad767704c","Type":"ContainerStarted","Data":"18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0"} Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.614527 4842 generic.go:334] "Generic (PLEG): container finished" podID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerID="c4c81403089b4ff965c28c1239a5d468868fca1e026fd9bf32239857f201514a" exitCode=143 Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.614571 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"68d35f8c-da79-4788-91e2-ac2ee3f9f79b","Type":"ContainerDied","Data":"c4c81403089b4ff965c28c1239a5d468868fca1e026fd9bf32239857f201514a"} Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.634421 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-94wmq" podStartSLOduration=2.821702697 podStartE2EDuration="7.634397071s" podCreationTimestamp="2026-03-11 19:14:00 +0000 UTC" firstStartedPulling="2026-03-11 19:14:01.528804737 +0000 UTC m=+1487.176501017" lastFinishedPulling="2026-03-11 19:14:06.341499111 +0000 UTC 
m=+1491.989195391" observedRunningTime="2026-03-11 19:14:07.625956499 +0000 UTC m=+1493.273652779" watchObservedRunningTime="2026-03-11 19:14:07.634397071 +0000 UTC m=+1493.282093351" Mar 11 19:14:07 crc kubenswrapper[4842]: I0311 19:14:07.782984 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1b521-account-delete-dr2vl"] Mar 11 19:14:07 crc kubenswrapper[4842]: W0311 19:14:07.806525 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod552b82d5_d576_45a9_94dc_0f653e61346a.slice/crio-86fe102f7335100c7c3aa923c55d5c9cdeef5dd2511848e898b3f940bb01ebf0 WatchSource:0}: Error finding container 86fe102f7335100c7c3aa923c55d5c9cdeef5dd2511848e898b3f940bb01ebf0: Status 404 returned error can't find the container with id 86fe102f7335100c7c3aa923c55d5c9cdeef5dd2511848e898b3f940bb01ebf0 Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.227929 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.352662 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff533f9-5b87-4048-a157-23b2f93578db-config-data\") pod \"eff533f9-5b87-4048-a157-23b2f93578db\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.352749 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/eff533f9-5b87-4048-a157-23b2f93578db-kube-api-access-tvs6s\") pod \"eff533f9-5b87-4048-a157-23b2f93578db\" (UID: \"eff533f9-5b87-4048-a157-23b2f93578db\") " Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.358477 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff533f9-5b87-4048-a157-23b2f93578db-kube-api-access-tvs6s" (OuterVolumeSpecName: "kube-api-access-tvs6s") pod "eff533f9-5b87-4048-a157-23b2f93578db" (UID: "eff533f9-5b87-4048-a157-23b2f93578db"). InnerVolumeSpecName "kube-api-access-tvs6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.374833 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff533f9-5b87-4048-a157-23b2f93578db-config-data" (OuterVolumeSpecName: "config-data") pod "eff533f9-5b87-4048-a157-23b2f93578db" (UID: "eff533f9-5b87-4048-a157-23b2f93578db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.454801 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff533f9-5b87-4048-a157-23b2f93578db-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.454833 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/eff533f9-5b87-4048-a157-23b2f93578db-kube-api-access-tvs6s\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.624346 4842 generic.go:334] "Generic (PLEG): container finished" podID="eff533f9-5b87-4048-a157-23b2f93578db" containerID="d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3" exitCode=0 Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.624421 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"eff533f9-5b87-4048-a157-23b2f93578db","Type":"ContainerDied","Data":"d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3"} Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.624424 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.624447 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"eff533f9-5b87-4048-a157-23b2f93578db","Type":"ContainerDied","Data":"a9822c4b50cd4464bda61edc7b8234c5f790ead8d09c002bb035bcbd668c6b81"} Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.624464 4842 scope.go:117] "RemoveContainer" containerID="d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.626864 4842 generic.go:334] "Generic (PLEG): container finished" podID="552b82d5-d576-45a9-94dc-0f653e61346a" containerID="7ad6f2f52233b63417b42ea2e13ffc051490e74f2473fa21d11a8706cb2cb3a6" exitCode=0 Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.626897 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" event={"ID":"552b82d5-d576-45a9-94dc-0f653e61346a","Type":"ContainerDied","Data":"7ad6f2f52233b63417b42ea2e13ffc051490e74f2473fa21d11a8706cb2cb3a6"} Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.626921 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" event={"ID":"552b82d5-d576-45a9-94dc-0f653e61346a","Type":"ContainerStarted","Data":"86fe102f7335100c7c3aa923c55d5c9cdeef5dd2511848e898b3f940bb01ebf0"} Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.651443 4842 scope.go:117] "RemoveContainer" containerID="d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3" Mar 11 19:14:08 crc kubenswrapper[4842]: E0311 19:14:08.652476 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3\": container with ID starting with 
d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3 not found: ID does not exist" containerID="d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.652514 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3"} err="failed to get container status \"d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3\": rpc error: code = NotFound desc = could not find container \"d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3\": container with ID starting with d528fc0237ae35bf2bd33d4ac8451f5b7db5db8f80084dfe758878d01e41d1d3 not found: ID does not exist" Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.661455 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.670228 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:14:08 crc kubenswrapper[4842]: I0311 19:14:08.971905 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff533f9-5b87-4048-a157-23b2f93578db" path="/var/lib/kubelet/pods/eff533f9-5b87-4048-a157-23b2f93578db/volumes" Mar 11 19:14:09 crc kubenswrapper[4842]: I0311 19:14:09.986436 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.078608 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj5nk\" (UniqueName: \"kubernetes.io/projected/552b82d5-d576-45a9-94dc-0f653e61346a-kube-api-access-hj5nk\") pod \"552b82d5-d576-45a9-94dc-0f653e61346a\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.078752 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/552b82d5-d576-45a9-94dc-0f653e61346a-operator-scripts\") pod \"552b82d5-d576-45a9-94dc-0f653e61346a\" (UID: \"552b82d5-d576-45a9-94dc-0f653e61346a\") " Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.080034 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552b82d5-d576-45a9-94dc-0f653e61346a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "552b82d5-d576-45a9-94dc-0f653e61346a" (UID: "552b82d5-d576-45a9-94dc-0f653e61346a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.083923 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552b82d5-d576-45a9-94dc-0f653e61346a-kube-api-access-hj5nk" (OuterVolumeSpecName: "kube-api-access-hj5nk") pod "552b82d5-d576-45a9-94dc-0f653e61346a" (UID: "552b82d5-d576-45a9-94dc-0f653e61346a"). InnerVolumeSpecName "kube-api-access-hj5nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.180649 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj5nk\" (UniqueName: \"kubernetes.io/projected/552b82d5-d576-45a9-94dc-0f653e61346a-kube-api-access-hj5nk\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.180681 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/552b82d5-d576-45a9-94dc-0f653e61346a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.586187 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.586246 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.645452 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" event={"ID":"552b82d5-d576-45a9-94dc-0f653e61346a","Type":"ContainerDied","Data":"86fe102f7335100c7c3aa923c55d5c9cdeef5dd2511848e898b3f940bb01ebf0"} Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.645782 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86fe102f7335100c7c3aa923c55d5c9cdeef5dd2511848e898b3f940bb01ebf0" Mar 11 19:14:10 crc kubenswrapper[4842]: I0311 19:14:10.645499 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1b521-account-delete-dr2vl" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.089262 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.194103 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56680d2-4bcf-43e4-99b7-18b2838047bb-config-data\") pod \"c56680d2-4bcf-43e4-99b7-18b2838047bb\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.194436 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56680d2-4bcf-43e4-99b7-18b2838047bb-logs\") pod \"c56680d2-4bcf-43e4-99b7-18b2838047bb\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.194579 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7jq7\" (UniqueName: \"kubernetes.io/projected/c56680d2-4bcf-43e4-99b7-18b2838047bb-kube-api-access-h7jq7\") pod \"c56680d2-4bcf-43e4-99b7-18b2838047bb\" (UID: \"c56680d2-4bcf-43e4-99b7-18b2838047bb\") " Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.194992 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56680d2-4bcf-43e4-99b7-18b2838047bb-logs" (OuterVolumeSpecName: "logs") pod "c56680d2-4bcf-43e4-99b7-18b2838047bb" (UID: "c56680d2-4bcf-43e4-99b7-18b2838047bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.199563 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56680d2-4bcf-43e4-99b7-18b2838047bb-kube-api-access-h7jq7" (OuterVolumeSpecName: "kube-api-access-h7jq7") pod "c56680d2-4bcf-43e4-99b7-18b2838047bb" (UID: "c56680d2-4bcf-43e4-99b7-18b2838047bb"). InnerVolumeSpecName "kube-api-access-h7jq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.215936 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56680d2-4bcf-43e4-99b7-18b2838047bb-config-data" (OuterVolumeSpecName: "config-data") pod "c56680d2-4bcf-43e4-99b7-18b2838047bb" (UID: "c56680d2-4bcf-43e4-99b7-18b2838047bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.296284 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56680d2-4bcf-43e4-99b7-18b2838047bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.296590 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56680d2-4bcf-43e4-99b7-18b2838047bb-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.296608 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7jq7\" (UniqueName: \"kubernetes.io/projected/c56680d2-4bcf-43e4-99b7-18b2838047bb-kube-api-access-h7jq7\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.470679 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.471015 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="d74cb58e-301a-4e57-81ee-a24e50e07e13" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://373c51a303721fec9abeebd8f8647af93b99c28494334462581e29e5ff6eee26" gracePeriod=30 Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.558652 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 
19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.558903 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="72beb476-94c1-4b97-bb3a-544abc447548" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" gracePeriod=30 Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.571636 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.578611 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-ccv28"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.643079 4842 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-94wmq" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="registry-server" probeResult="failure" output=< Mar 11 19:14:11 crc kubenswrapper[4842]: timeout: failed to connect service ":50051" within 1s Mar 11 19:14:11 crc kubenswrapper[4842]: > Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.657299 4842 generic.go:334] "Generic (PLEG): container finished" podID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerID="dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca" exitCode=0 Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.657361 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"c56680d2-4bcf-43e4-99b7-18b2838047bb","Type":"ContainerDied","Data":"dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca"} Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.657388 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" 
event={"ID":"c56680d2-4bcf-43e4-99b7-18b2838047bb","Type":"ContainerDied","Data":"fe0883c887df6dbc6a2ca4a04db6c76df6d816fe718cb8416c91f02b07e9da52"} Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.657603 4842 scope.go:117] "RemoveContainer" containerID="dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.657750 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.664856 4842 generic.go:334] "Generic (PLEG): container finished" podID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerID="fe182fe268a3a54536616ad98fb51448335992a69d170e0821bd01e20a2cbcfc" exitCode=0 Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.664907 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"68d35f8c-da79-4788-91e2-ac2ee3f9f79b","Type":"ContainerDied","Data":"fe182fe268a3a54536616ad98fb51448335992a69d170e0821bd01e20a2cbcfc"} Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.697537 4842 scope.go:117] "RemoveContainer" containerID="e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.702035 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.721802 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.726114 4842 scope.go:117] "RemoveContainer" containerID="dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca" Mar 11 19:14:11 crc kubenswrapper[4842]: E0311 19:14:11.726863 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca\": container with ID starting with dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca not found: ID does not exist" containerID="dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.726925 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca"} err="failed to get container status \"dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca\": rpc error: code = NotFound desc = could not find container \"dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca\": container with ID starting with dc8ebe62d33d05a602ee7ff3a5c4d47378a9a86775e5bced8fe9bb2e49cc77ca not found: ID does not exist" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.726955 4842 scope.go:117] "RemoveContainer" containerID="e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75" Mar 11 19:14:11 crc kubenswrapper[4842]: E0311 19:14:11.729173 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75\": container with ID starting with e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75 not found: ID does not exist" containerID="e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.729216 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75"} err="failed to get container status \"e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75\": rpc error: code = NotFound desc = could not find container \"e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75\": container with ID 
starting with e2ac2b32f3f137aae3bf0dbf592be94b83672070f4cd990c3316a39fde7e4a75 not found: ID does not exist" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743326 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:14:11 crc kubenswrapper[4842]: E0311 19:14:11.743718 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-log" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743734 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-log" Mar 11 19:14:11 crc kubenswrapper[4842]: E0311 19:14:11.743756 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-metadata" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743763 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-metadata" Mar 11 19:14:11 crc kubenswrapper[4842]: E0311 19:14:11.743774 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552b82d5-d576-45a9-94dc-0f653e61346a" containerName="mariadb-account-delete" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743782 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="552b82d5-d576-45a9-94dc-0f653e61346a" containerName="mariadb-account-delete" Mar 11 19:14:11 crc kubenswrapper[4842]: E0311 19:14:11.743796 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff533f9-5b87-4048-a157-23b2f93578db" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743803 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff533f9-5b87-4048-a157-23b2f93578db" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:14:11 crc 
kubenswrapper[4842]: I0311 19:14:11.743959 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-log" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743973 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" containerName="nova-kuttl-metadata-metadata" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743984 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff533f9-5b87-4048-a157-23b2f93578db" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.743998 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="552b82d5-d576-45a9-94dc-0f653e61346a" containerName="mariadb-account-delete" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.744848 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.753509 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.755880 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.799403 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.893433 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-kx6kb"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.904174 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-kx6kb"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.914109 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell1b521-account-delete-dr2vl"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.915987 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6xd9\" (UniqueName: \"kubernetes.io/projected/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-kube-api-access-r6xd9\") pod \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.916058 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-config-data\") pod \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.916106 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-logs\") pod \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\" (UID: \"68d35f8c-da79-4788-91e2-ac2ee3f9f79b\") " Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.916381 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " 
pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.916422 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrcq\" (UniqueName: \"kubernetes.io/projected/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-kube-api-access-rbrcq\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.916488 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.918184 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-logs" (OuterVolumeSpecName: "logs") pod "68d35f8c-da79-4788-91e2-ac2ee3f9f79b" (UID: "68d35f8c-da79-4788-91e2-ac2ee3f9f79b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.920768 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.922388 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-kube-api-access-r6xd9" (OuterVolumeSpecName: "kube-api-access-r6xd9") pod "68d35f8c-da79-4788-91e2-ac2ee3f9f79b" (UID: "68d35f8c-da79-4788-91e2-ac2ee3f9f79b"). InnerVolumeSpecName "kube-api-access-r6xd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.934085 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell1b521-account-delete-dr2vl"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.935302 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-b521-account-create-update-8tl7p"] Mar 11 19:14:11 crc kubenswrapper[4842]: I0311 19:14:11.937108 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-config-data" (OuterVolumeSpecName: "config-data") pod "68d35f8c-da79-4788-91e2-ac2ee3f9f79b" (UID: "68d35f8c-da79-4788-91e2-ac2ee3f9f79b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.017548 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.017591 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrcq\" (UniqueName: \"kubernetes.io/projected/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-kube-api-access-rbrcq\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.017649 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 
19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.017721 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6xd9\" (UniqueName: \"kubernetes.io/projected/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-kube-api-access-r6xd9\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.017733 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.017741 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68d35f8c-da79-4788-91e2-ac2ee3f9f79b-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.018309 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.022929 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:12 crc kubenswrapper[4842]: E0311 19:14:12.551992 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:14:12 crc kubenswrapper[4842]: E0311 19:14:12.576947 4842 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.577156 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrcq\" (UniqueName: \"kubernetes.io/projected/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-kube-api-access-rbrcq\") pod \"nova-kuttl-metadata-0\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:12 crc kubenswrapper[4842]: E0311 19:14:12.578552 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:14:12 crc kubenswrapper[4842]: E0311 19:14:12.578597 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="72beb476-94c1-4b97-bb3a-544abc447548" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.673105 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"68d35f8c-da79-4788-91e2-ac2ee3f9f79b","Type":"ContainerDied","Data":"2f950925016cc57ed138e4a996e86146e9a052f6ed4bf653d41d8408bc5b7abd"} Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.673165 4842 scope.go:117] "RemoveContainer" containerID="fe182fe268a3a54536616ad98fb51448335992a69d170e0821bd01e20a2cbcfc" Mar 11 19:14:12 crc 
kubenswrapper[4842]: I0311 19:14:12.673316 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.695672 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.711023 4842 scope.go:117] "RemoveContainer" containerID="c4c81403089b4ff965c28c1239a5d468868fca1e026fd9bf32239857f201514a" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.712168 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.722351 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.746949 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:14:12 crc kubenswrapper[4842]: E0311 19:14:12.747395 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-log" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.747413 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-log" Mar 11 19:14:12 crc kubenswrapper[4842]: E0311 19:14:12.747437 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-api" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.747446 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-api" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.747652 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" 
containerName="nova-kuttl-api-log" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.747673 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-api" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.748782 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.751165 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.757764 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.850908 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-logs\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.851240 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.851360 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65952\" (UniqueName: \"kubernetes.io/projected/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-kube-api-access-65952\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.953468 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65952\" (UniqueName: \"kubernetes.io/projected/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-kube-api-access-65952\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.953595 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-logs\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.953626 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.955392 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-logs\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.968409 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.974976 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f912a68-5aee-4b66-9c2b-a25bc7736725" path="/var/lib/kubelet/pods/0f912a68-5aee-4b66-9c2b-a25bc7736725/volumes" Mar 11 
19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.975492 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552b82d5-d576-45a9-94dc-0f653e61346a" path="/var/lib/kubelet/pods/552b82d5-d576-45a9-94dc-0f653e61346a/volumes" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.976048 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" path="/var/lib/kubelet/pods/68d35f8c-da79-4788-91e2-ac2ee3f9f79b/volumes" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.977074 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65952\" (UniqueName: \"kubernetes.io/projected/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-kube-api-access-65952\") pod \"nova-kuttl-api-0\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.977102 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a073ef4-9c1e-481a-aa9a-405e4892e3ef" path="/var/lib/kubelet/pods/6a073ef4-9c1e-481a-aa9a-405e4892e3ef/volumes" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.977643 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8fbb24-c60f-4143-8811-f56af287ace2" path="/var/lib/kubelet/pods/af8fbb24-c60f-4143-8811-f56af287ace2/volumes" Mar 11 19:14:12 crc kubenswrapper[4842]: I0311 19:14:12.979261 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56680d2-4bcf-43e4-99b7-18b2838047bb" path="/var/lib/kubelet/pods/c56680d2-4bcf-43e4-99b7-18b2838047bb/volumes" Mar 11 19:14:13 crc kubenswrapper[4842]: I0311 19:14:13.104925 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:13 crc kubenswrapper[4842]: I0311 19:14:13.263389 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:14:13 crc kubenswrapper[4842]: I0311 19:14:13.565162 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:14:13 crc kubenswrapper[4842]: I0311 19:14:13.688912 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd","Type":"ContainerStarted","Data":"cb8205d05ddf3c6c91628b4b00a6ea5a405bf33e0b4994df2604b74f75cefff6"} Mar 11 19:14:13 crc kubenswrapper[4842]: I0311 19:14:13.694827 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc","Type":"ContainerStarted","Data":"0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3"} Mar 11 19:14:13 crc kubenswrapper[4842]: I0311 19:14:13.694886 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc","Type":"ContainerStarted","Data":"5d0a7458342c17eeae057ea54b6931414fdcf73b3b4e1a9b52b67af56bed5b88"} Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.710557 4842 generic.go:334] "Generic (PLEG): container finished" podID="d74cb58e-301a-4e57-81ee-a24e50e07e13" containerID="373c51a303721fec9abeebd8f8647af93b99c28494334462581e29e5ff6eee26" exitCode=0 Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.710635 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d74cb58e-301a-4e57-81ee-a24e50e07e13","Type":"ContainerDied","Data":"373c51a303721fec9abeebd8f8647af93b99c28494334462581e29e5ff6eee26"} Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.712496 4842 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc","Type":"ContainerStarted","Data":"209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1"} Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.714074 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd","Type":"ContainerStarted","Data":"2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745"} Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.714104 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd","Type":"ContainerStarted","Data":"826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5"} Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.734753 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=3.734732163 podStartE2EDuration="3.734732163s" podCreationTimestamp="2026-03-11 19:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:14.726886877 +0000 UTC m=+1500.374583157" watchObservedRunningTime="2026-03-11 19:14:14.734732163 +0000 UTC m=+1500.382428443" Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.747152 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.74713453 podStartE2EDuration="2.74713453s" podCreationTimestamp="2026-03-11 19:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:14.743471353 +0000 UTC m=+1500.391167633" watchObservedRunningTime="2026-03-11 19:14:14.74713453 +0000 UTC m=+1500.394830820" Mar 11 
19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.835310 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.988574 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74cb58e-301a-4e57-81ee-a24e50e07e13-config-data\") pod \"d74cb58e-301a-4e57-81ee-a24e50e07e13\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " Mar 11 19:14:14 crc kubenswrapper[4842]: I0311 19:14:14.988653 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9nz2\" (UniqueName: \"kubernetes.io/projected/d74cb58e-301a-4e57-81ee-a24e50e07e13-kube-api-access-l9nz2\") pod \"d74cb58e-301a-4e57-81ee-a24e50e07e13\" (UID: \"d74cb58e-301a-4e57-81ee-a24e50e07e13\") " Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.010639 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74cb58e-301a-4e57-81ee-a24e50e07e13-kube-api-access-l9nz2" (OuterVolumeSpecName: "kube-api-access-l9nz2") pod "d74cb58e-301a-4e57-81ee-a24e50e07e13" (UID: "d74cb58e-301a-4e57-81ee-a24e50e07e13"). InnerVolumeSpecName "kube-api-access-l9nz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.017177 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74cb58e-301a-4e57-81ee-a24e50e07e13-config-data" (OuterVolumeSpecName: "config-data") pod "d74cb58e-301a-4e57-81ee-a24e50e07e13" (UID: "d74cb58e-301a-4e57-81ee-a24e50e07e13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.091791 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74cb58e-301a-4e57-81ee-a24e50e07e13-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.092144 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9nz2\" (UniqueName: \"kubernetes.io/projected/d74cb58e-301a-4e57-81ee-a24e50e07e13-kube-api-access-l9nz2\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.727071 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d74cb58e-301a-4e57-81ee-a24e50e07e13","Type":"ContainerDied","Data":"ad4b5d198dfafa10586ca2c049ff7d85dd19355e37d0460d522f1ae9fe01155c"} Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.727174 4842 scope.go:117] "RemoveContainer" containerID="373c51a303721fec9abeebd8f8647af93b99c28494334462581e29e5ff6eee26" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.727191 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.771751 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.780605 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.805840 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:15 crc kubenswrapper[4842]: E0311 19:14:15.806341 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74cb58e-301a-4e57-81ee-a24e50e07e13" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.806367 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74cb58e-301a-4e57-81ee-a24e50e07e13" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.806560 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74cb58e-301a-4e57-81ee-a24e50e07e13" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.807237 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.809861 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.832477 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.906211 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgp4\" (UniqueName: \"kubernetes.io/projected/d17813a3-ec29-4465-8a6a-9fa8165b335c-kube-api-access-tqgp4\") pod \"nova-kuttl-scheduler-0\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:15 crc kubenswrapper[4842]: I0311 19:14:15.906359 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17813a3-ec29-4465-8a6a-9fa8165b335c-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.008286 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17813a3-ec29-4465-8a6a-9fa8165b335c-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.008407 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgp4\" (UniqueName: \"kubernetes.io/projected/d17813a3-ec29-4465-8a6a-9fa8165b335c-kube-api-access-tqgp4\") pod \"nova-kuttl-scheduler-0\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " 
pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.019652 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17813a3-ec29-4465-8a6a-9fa8165b335c-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.024754 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgp4\" (UniqueName: \"kubernetes.io/projected/d17813a3-ec29-4465-8a6a-9fa8165b335c-kube-api-access-tqgp4\") pod \"nova-kuttl-scheduler-0\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.143593 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.620568 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.739708 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d17813a3-ec29-4465-8a6a-9fa8165b335c","Type":"ContainerStarted","Data":"01b0fb3b8eb4fb6df4dc13c4c6728ff546f813f203643388ac09c56b45150d45"} Mar 11 19:14:16 crc kubenswrapper[4842]: I0311 19:14:16.973840 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74cb58e-301a-4e57-81ee-a24e50e07e13" path="/var/lib/kubelet/pods/d74cb58e-301a-4e57-81ee-a24e50e07e13/volumes" Mar 11 19:14:17 crc kubenswrapper[4842]: E0311 19:14:17.334473 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:14:17 crc kubenswrapper[4842]: E0311 19:14:17.337579 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:14:17 crc kubenswrapper[4842]: E0311 19:14:17.340315 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:14:17 crc kubenswrapper[4842]: E0311 19:14:17.340442 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="72beb476-94c1-4b97-bb3a-544abc447548" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:14:17 crc kubenswrapper[4842]: I0311 19:14:17.819773 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d17813a3-ec29-4465-8a6a-9fa8165b335c","Type":"ContainerStarted","Data":"944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1"} Mar 11 19:14:17 crc kubenswrapper[4842]: I0311 19:14:17.868183 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.86815374 podStartE2EDuration="2.86815374s" podCreationTimestamp="2026-03-11 19:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:17.861847174 +0000 UTC m=+1503.509543484" watchObservedRunningTime="2026-03-11 19:14:17.86815374 +0000 UTC m=+1503.515850010" Mar 11 19:14:20 crc kubenswrapper[4842]: I0311 19:14:20.637375 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:20 crc kubenswrapper[4842]: I0311 19:14:20.693201 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.144475 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.465882 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.632606 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9cpz\" (UniqueName: \"kubernetes.io/projected/72beb476-94c1-4b97-bb3a-544abc447548-kube-api-access-x9cpz\") pod \"72beb476-94c1-4b97-bb3a-544abc447548\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.632714 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72beb476-94c1-4b97-bb3a-544abc447548-config-data\") pod \"72beb476-94c1-4b97-bb3a-544abc447548\" (UID: \"72beb476-94c1-4b97-bb3a-544abc447548\") " Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.640091 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72beb476-94c1-4b97-bb3a-544abc447548-kube-api-access-x9cpz" (OuterVolumeSpecName: "kube-api-access-x9cpz") pod "72beb476-94c1-4b97-bb3a-544abc447548" 
(UID: "72beb476-94c1-4b97-bb3a-544abc447548"). InnerVolumeSpecName "kube-api-access-x9cpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.659687 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72beb476-94c1-4b97-bb3a-544abc447548-config-data" (OuterVolumeSpecName: "config-data") pod "72beb476-94c1-4b97-bb3a-544abc447548" (UID: "72beb476-94c1-4b97-bb3a-544abc447548"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.735750 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9cpz\" (UniqueName: \"kubernetes.io/projected/72beb476-94c1-4b97-bb3a-544abc447548-kube-api-access-x9cpz\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.736352 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72beb476-94c1-4b97-bb3a-544abc447548-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.862651 4842 generic.go:334] "Generic (PLEG): container finished" podID="72beb476-94c1-4b97-bb3a-544abc447548" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" exitCode=0 Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.862719 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"72beb476-94c1-4b97-bb3a-544abc447548","Type":"ContainerDied","Data":"ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3"} Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.862760 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" 
event={"ID":"72beb476-94c1-4b97-bb3a-544abc447548","Type":"ContainerDied","Data":"b6ab19762c847a6837d092cefb3e159edc6a74d365ce2d55db4ac764160fb7b7"} Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.862791 4842 scope.go:117] "RemoveContainer" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.862966 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.887657 4842 scope.go:117] "RemoveContainer" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" Mar 11 19:14:21 crc kubenswrapper[4842]: E0311 19:14:21.888224 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3\": container with ID starting with ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3 not found: ID does not exist" containerID="ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.888435 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3"} err="failed to get container status \"ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3\": rpc error: code = NotFound desc = could not find container \"ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3\": container with ID starting with ad83c9e68d61d1dfaf3f3eceeb7f1929f4cbac4bd1a64de74108abb0d66b86c3 not found: ID does not exist" Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.916063 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:14:21 crc kubenswrapper[4842]: I0311 19:14:21.926166 
4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:14:22 crc kubenswrapper[4842]: I0311 19:14:22.696409 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:22 crc kubenswrapper[4842]: I0311 19:14:22.696477 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:22 crc kubenswrapper[4842]: I0311 19:14:22.970149 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72beb476-94c1-4b97-bb3a-544abc447548" path="/var/lib/kubelet/pods/72beb476-94c1-4b97-bb3a-544abc447548/volumes" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.046600 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-94wmq"] Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.047213 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-94wmq" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="registry-server" containerID="cri-o://18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0" gracePeriod=2 Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.105367 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.105677 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.563860 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.680654 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-catalog-content\") pod \"a2964bda-6449-4dd7-b3de-335ad767704c\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.680708 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-utilities\") pod \"a2964bda-6449-4dd7-b3de-335ad767704c\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.680808 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq572\" (UniqueName: \"kubernetes.io/projected/a2964bda-6449-4dd7-b3de-335ad767704c-kube-api-access-dq572\") pod \"a2964bda-6449-4dd7-b3de-335ad767704c\" (UID: \"a2964bda-6449-4dd7-b3de-335ad767704c\") " Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.683584 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-utilities" (OuterVolumeSpecName: "utilities") pod "a2964bda-6449-4dd7-b3de-335ad767704c" (UID: "a2964bda-6449-4dd7-b3de-335ad767704c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.685559 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2964bda-6449-4dd7-b3de-335ad767704c-kube-api-access-dq572" (OuterVolumeSpecName: "kube-api-access-dq572") pod "a2964bda-6449-4dd7-b3de-335ad767704c" (UID: "a2964bda-6449-4dd7-b3de-335ad767704c"). InnerVolumeSpecName "kube-api-access-dq572". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.778482 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.151:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.778540 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.151:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.783166 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.783213 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq572\" (UniqueName: \"kubernetes.io/projected/a2964bda-6449-4dd7-b3de-335ad767704c-kube-api-access-dq572\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.820595 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2964bda-6449-4dd7-b3de-335ad767704c" (UID: "a2964bda-6449-4dd7-b3de-335ad767704c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.885149 4842 generic.go:334] "Generic (PLEG): container finished" podID="a2964bda-6449-4dd7-b3de-335ad767704c" containerID="18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0" exitCode=0 Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.888426 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94wmq" event={"ID":"a2964bda-6449-4dd7-b3de-335ad767704c","Type":"ContainerDied","Data":"18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0"} Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.888478 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94wmq" event={"ID":"a2964bda-6449-4dd7-b3de-335ad767704c","Type":"ContainerDied","Data":"80d9ddb7941743ace124d30e30ada5a0f1123ad9cfa3ae88b40d54b0130837b7"} Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.888491 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-94wmq" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.888497 4842 scope.go:117] "RemoveContainer" containerID="18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.898073 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2964bda-6449-4dd7-b3de-335ad767704c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.909958 4842 scope.go:117] "RemoveContainer" containerID="bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.926523 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-94wmq"] Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.932051 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-94wmq"] Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.941705 4842 scope.go:117] "RemoveContainer" containerID="4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.974079 4842 scope.go:117] "RemoveContainer" containerID="18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0" Mar 11 19:14:23 crc kubenswrapper[4842]: E0311 19:14:23.974619 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0\": container with ID starting with 18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0 not found: ID does not exist" containerID="18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.974666 4842 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0"} err="failed to get container status \"18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0\": rpc error: code = NotFound desc = could not find container \"18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0\": container with ID starting with 18353412455e2d06949dd1ef568d13da45099de1993a506b259aa27253f5dff0 not found: ID does not exist" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.974692 4842 scope.go:117] "RemoveContainer" containerID="bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646" Mar 11 19:14:23 crc kubenswrapper[4842]: E0311 19:14:23.975103 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646\": container with ID starting with bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646 not found: ID does not exist" containerID="bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.975139 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646"} err="failed to get container status \"bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646\": rpc error: code = NotFound desc = could not find container \"bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646\": container with ID starting with bcd09bfa96adbbc93d23f4817808bc1deef3ac9a72617acdee1a1e072c5fc646 not found: ID does not exist" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.975162 4842 scope.go:117] "RemoveContainer" containerID="4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7" Mar 11 19:14:23 crc kubenswrapper[4842]: E0311 19:14:23.975495 4842 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7\": container with ID starting with 4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7 not found: ID does not exist" containerID="4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7" Mar 11 19:14:23 crc kubenswrapper[4842]: I0311 19:14:23.975526 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7"} err="failed to get container status \"4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7\": rpc error: code = NotFound desc = could not find container \"4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7\": container with ID starting with 4e3b7be8882e417f927d7262f29ecc4ff59f1b043a5123f5b9a87f511006aca7 not found: ID does not exist" Mar 11 19:14:24 crc kubenswrapper[4842]: I0311 19:14:24.146688 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.152:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:14:24 crc kubenswrapper[4842]: I0311 19:14:24.146701 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.152:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:14:24 crc kubenswrapper[4842]: I0311 19:14:24.973362 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" path="/var/lib/kubelet/pods/a2964bda-6449-4dd7-b3de-335ad767704c/volumes" Mar 11 19:14:26 crc kubenswrapper[4842]: 
I0311 19:14:26.144495 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:26 crc kubenswrapper[4842]: I0311 19:14:26.179839 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:26 crc kubenswrapper[4842]: I0311 19:14:26.958237 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:27 crc kubenswrapper[4842]: I0311 19:14:27.344204 4842 scope.go:117] "RemoveContainer" containerID="9077efd6e8c02cda76448cdcf73db16153069e6c8b272a428484e199e5a5a97a" Mar 11 19:14:29 crc kubenswrapper[4842]: I0311 19:14:29.779870 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.145:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:14:29 crc kubenswrapper[4842]: I0311 19:14:29.779790 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="68d35f8c-da79-4788-91e2-ac2ee3f9f79b" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.145:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:14:30 crc kubenswrapper[4842]: I0311 19:14:30.696971 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:30 crc kubenswrapper[4842]: I0311 19:14:30.698091 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:31 crc kubenswrapper[4842]: I0311 19:14:31.105323 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 
11 19:14:31 crc kubenswrapper[4842]: I0311 19:14:31.105392 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:31 crc kubenswrapper[4842]: I0311 19:14:31.472829 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:14:31 crc kubenswrapper[4842]: I0311 19:14:31.472913 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:14:32 crc kubenswrapper[4842]: I0311 19:14:32.699749 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:32 crc kubenswrapper[4842]: I0311 19:14:32.700209 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:32 crc kubenswrapper[4842]: I0311 19:14:32.702743 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:32 crc kubenswrapper[4842]: I0311 19:14:32.702909 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:33 crc kubenswrapper[4842]: I0311 19:14:33.110123 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:33 crc kubenswrapper[4842]: I0311 19:14:33.111583 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:33 crc kubenswrapper[4842]: I0311 19:14:33.114252 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:33 crc kubenswrapper[4842]: I0311 19:14:33.996875 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:14:35 crc kubenswrapper[4842]: I0311 19:14:35.975553 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5"] Mar 11 19:14:35 crc kubenswrapper[4842]: I0311 19:14:35.987310 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-delete-sk9t5"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.001283 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.015836 4842 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="nova-kuttl-default/nova-kuttl-api-0" secret="" err="secret \"nova-nova-kuttl-dockercfg-kvrtt\" not found" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.026079 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.043348 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-dt5dh"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.049536 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-v7mgg"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.113520 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.123208 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.123572 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-log" containerID="cri-o://0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3" gracePeriod=30 Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.124063 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1" gracePeriod=30 Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.136847 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novaapi3968-account-delete-sr2nh"] Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.137210 4842 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="extract-content" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.137232 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="extract-content" Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.137265 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72beb476-94c1-4b97-bb3a-544abc447548" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.137295 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="72beb476-94c1-4b97-bb3a-544abc447548" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.137309 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="extract-utilities" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.137317 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="extract-utilities" Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.137330 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="registry-server" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.137340 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="registry-server" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.137527 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2964bda-6449-4dd7-b3de-335ad767704c" containerName="registry-server" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.137547 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="72beb476-94c1-4b97-bb3a-544abc447548" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:14:36 crc 
kubenswrapper[4842]: I0311 19:14:36.138589 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.143531 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-api-config-data: secret "nova-kuttl-api-config-data" not found Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.143630 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data podName:bf3b5b7d-fb77-400c-97f2-f10c5064ffbd nodeName:}" failed. No retries permitted until 2026-03-11 19:14:36.643607107 +0000 UTC m=+1522.291303437 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data") pod "nova-kuttl-api-0" (UID: "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd") : secret "nova-kuttl-api-config-data" not found Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.152855 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapi3968-account-delete-sr2nh"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.228416 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.228667 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="d17813a3-ec29-4465-8a6a-9fa8165b335c" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1" gracePeriod=30 Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.238096 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell0a055-account-delete-qsx7n"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.239560 4842 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.244330 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn72m\" (UniqueName: \"kubernetes.io/projected/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-kube-api-access-tn72m\") pod \"novaapi3968-account-delete-sr2nh\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.244393 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-operator-scripts\") pod \"novaapi3968-account-delete-sr2nh\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.254085 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell0a055-account-delete-qsx7n"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.345513 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn72m\" (UniqueName: \"kubernetes.io/projected/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-kube-api-access-tn72m\") pod \"novaapi3968-account-delete-sr2nh\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.346465 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-operator-scripts\") pod \"novaapi3968-account-delete-sr2nh\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 
19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.346556 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-operator-scripts\") pod \"novacell0a055-account-delete-qsx7n\" (UID: \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.346679 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ngc\" (UniqueName: \"kubernetes.io/projected/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-kube-api-access-l4ngc\") pod \"novacell0a055-account-delete-qsx7n\" (UID: \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.347976 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-operator-scripts\") pod \"novaapi3968-account-delete-sr2nh\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.371930 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn72m\" (UniqueName: \"kubernetes.io/projected/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-kube-api-access-tn72m\") pod \"novaapi3968-account-delete-sr2nh\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.448145 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-operator-scripts\") pod \"novacell0a055-account-delete-qsx7n\" (UID: 
\"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.448236 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ngc\" (UniqueName: \"kubernetes.io/projected/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-kube-api-access-l4ngc\") pod \"novacell0a055-account-delete-qsx7n\" (UID: \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.448862 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-operator-scripts\") pod \"novacell0a055-account-delete-qsx7n\" (UID: \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.466139 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4ngc\" (UniqueName: \"kubernetes.io/projected/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-kube-api-access-l4ngc\") pod \"novacell0a055-account-delete-qsx7n\" (UID: \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.494434 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.594438 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.651306 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-api-config-data: secret "nova-kuttl-api-config-data" not found Mar 11 19:14:36 crc kubenswrapper[4842]: E0311 19:14:36.651380 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data podName:bf3b5b7d-fb77-400c-97f2-f10c5064ffbd nodeName:}" failed. No retries permitted until 2026-03-11 19:14:37.651360974 +0000 UTC m=+1523.299057254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data") pod "nova-kuttl-api-0" (UID: "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd") : secret "nova-kuttl-api-config-data" not found Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.671111 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.688064 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.688516 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="9e19bc52-96e1-47ba-83de-cdad48efca4f" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://537a24a5bb0b6e011b4c285f435d493276c1151c3a90489dd7a5617c9dc3402f" gracePeriod=30 Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.702337 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-76jzf"] Mar 11 19:14:36 crc kubenswrapper[4842]: I0311 19:14:36.774038 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["nova-kuttl-default/novaapi3968-account-delete-sr2nh"] Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.005716 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="042cdb69-b291-4c2d-a73b-e5ed47ee3760" path="/var/lib/kubelet/pods/042cdb69-b291-4c2d-a73b-e5ed47ee3760/volumes" Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.007429 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1135b7fc-e609-4d5c-8301-2606cf886c49" path="/var/lib/kubelet/pods/1135b7fc-e609-4d5c-8301-2606cf886c49/volumes" Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.008148 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e1d8bf-ceef-4519-affb-14fe1769a799" path="/var/lib/kubelet/pods/34e1d8bf-ceef-4519-affb-14fe1769a799/volumes" Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.013885 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452cba63-3d2d-41ae-a655-2f9b6cf932e9" path="/var/lib/kubelet/pods/452cba63-3d2d-41ae-a655-2f9b6cf932e9/volumes" Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.052577 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" event={"ID":"d04790e8-83cc-4d51-ab1c-f18e77eaa1de","Type":"ContainerStarted","Data":"d11a279a87598a498c13059180aa8205b1c81157f7a808b5b01de342fa5ce19f"} Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.080671 4842 generic.go:334] "Generic (PLEG): container finished" podID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerID="0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3" exitCode=143 Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.080907 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc","Type":"ContainerDied","Data":"0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3"} Mar 11 19:14:37 crc 
kubenswrapper[4842]: I0311 19:14:37.081218 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-log" containerID="cri-o://826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5" gracePeriod=30 Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.081400 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-api" containerID="cri-o://2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745" gracePeriod=30 Mar 11 19:14:37 crc kubenswrapper[4842]: W0311 19:14:37.153173 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd9f18b6_7d89_488d_b047_c50abe0cfaf6.slice/crio-e9b3c974904a586ee51ea151178c67421c42821c4c7e6d8a9cc9f0a6a19f084b WatchSource:0}: Error finding container e9b3c974904a586ee51ea151178c67421c42821c4c7e6d8a9cc9f0a6a19f084b: Status 404 returned error can't find the container with id e9b3c974904a586ee51ea151178c67421c42821c4c7e6d8a9cc9f0a6a19f084b Mar 11 19:14:37 crc kubenswrapper[4842]: I0311 19:14:37.153916 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell0a055-account-delete-qsx7n"] Mar 11 19:14:37 crc kubenswrapper[4842]: E0311 19:14:37.670604 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-api-config-data: secret "nova-kuttl-api-config-data" not found Mar 11 19:14:37 crc kubenswrapper[4842]: E0311 19:14:37.671009 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data podName:bf3b5b7d-fb77-400c-97f2-f10c5064ffbd nodeName:}" failed. No retries permitted until 2026-03-11 19:14:39.670985097 +0000 UTC m=+1525.318681377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data") pod "nova-kuttl-api-0" (UID: "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd") : secret "nova-kuttl-api-config-data" not found Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.089419 4842 generic.go:334] "Generic (PLEG): container finished" podID="bd9f18b6-7d89-488d-b047-c50abe0cfaf6" containerID="0ffe9736c972d7f402106d5386a00c71fa67bef70d8cc297e74b639ceea37bda" exitCode=0 Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.089461 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" event={"ID":"bd9f18b6-7d89-488d-b047-c50abe0cfaf6","Type":"ContainerDied","Data":"0ffe9736c972d7f402106d5386a00c71fa67bef70d8cc297e74b639ceea37bda"} Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.089730 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" event={"ID":"bd9f18b6-7d89-488d-b047-c50abe0cfaf6","Type":"ContainerStarted","Data":"e9b3c974904a586ee51ea151178c67421c42821c4c7e6d8a9cc9f0a6a19f084b"} Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.091718 4842 generic.go:334] "Generic (PLEG): container finished" podID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerID="826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5" exitCode=143 Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.091814 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd","Type":"ContainerDied","Data":"826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5"} Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.093723 4842 generic.go:334] "Generic (PLEG): container finished" podID="d04790e8-83cc-4d51-ab1c-f18e77eaa1de" containerID="51711445d91b5b961f46016b717fdf0be2dde3b84c1b94b4bacf42d1eec37b57" exitCode=0 Mar 
11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.093743 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" event={"ID":"d04790e8-83cc-4d51-ab1c-f18e77eaa1de","Type":"ContainerDied","Data":"51711445d91b5b961f46016b717fdf0be2dde3b84c1b94b4bacf42d1eec37b57"} Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.095377 4842 generic.go:334] "Generic (PLEG): container finished" podID="9e19bc52-96e1-47ba-83de-cdad48efca4f" containerID="537a24a5bb0b6e011b4c285f435d493276c1151c3a90489dd7a5617c9dc3402f" exitCode=0 Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.095404 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"9e19bc52-96e1-47ba-83de-cdad48efca4f","Type":"ContainerDied","Data":"537a24a5bb0b6e011b4c285f435d493276c1151c3a90489dd7a5617c9dc3402f"} Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.319891 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.382766 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7bsr\" (UniqueName: \"kubernetes.io/projected/9e19bc52-96e1-47ba-83de-cdad48efca4f-kube-api-access-q7bsr\") pod \"9e19bc52-96e1-47ba-83de-cdad48efca4f\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.382855 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e19bc52-96e1-47ba-83de-cdad48efca4f-config-data\") pod \"9e19bc52-96e1-47ba-83de-cdad48efca4f\" (UID: \"9e19bc52-96e1-47ba-83de-cdad48efca4f\") " Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.392376 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e19bc52-96e1-47ba-83de-cdad48efca4f-kube-api-access-q7bsr" (OuterVolumeSpecName: "kube-api-access-q7bsr") pod "9e19bc52-96e1-47ba-83de-cdad48efca4f" (UID: "9e19bc52-96e1-47ba-83de-cdad48efca4f"). InnerVolumeSpecName "kube-api-access-q7bsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.417088 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e19bc52-96e1-47ba-83de-cdad48efca4f-config-data" (OuterVolumeSpecName: "config-data") pod "9e19bc52-96e1-47ba-83de-cdad48efca4f" (UID: "9e19bc52-96e1-47ba-83de-cdad48efca4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.487157 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7bsr\" (UniqueName: \"kubernetes.io/projected/9e19bc52-96e1-47ba-83de-cdad48efca4f-kube-api-access-q7bsr\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.487186 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e19bc52-96e1-47ba-83de-cdad48efca4f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.538477 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.588045 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17813a3-ec29-4465-8a6a-9fa8165b335c-config-data\") pod \"d17813a3-ec29-4465-8a6a-9fa8165b335c\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.588111 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqgp4\" (UniqueName: \"kubernetes.io/projected/d17813a3-ec29-4465-8a6a-9fa8165b335c-kube-api-access-tqgp4\") pod \"d17813a3-ec29-4465-8a6a-9fa8165b335c\" (UID: \"d17813a3-ec29-4465-8a6a-9fa8165b335c\") " Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.592784 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17813a3-ec29-4465-8a6a-9fa8165b335c-kube-api-access-tqgp4" (OuterVolumeSpecName: "kube-api-access-tqgp4") pod "d17813a3-ec29-4465-8a6a-9fa8165b335c" (UID: "d17813a3-ec29-4465-8a6a-9fa8165b335c"). InnerVolumeSpecName "kube-api-access-tqgp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.609320 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17813a3-ec29-4465-8a6a-9fa8165b335c-config-data" (OuterVolumeSpecName: "config-data") pod "d17813a3-ec29-4465-8a6a-9fa8165b335c" (UID: "d17813a3-ec29-4465-8a6a-9fa8165b335c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.689960 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d17813a3-ec29-4465-8a6a-9fa8165b335c-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:38 crc kubenswrapper[4842]: I0311 19:14:38.690032 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqgp4\" (UniqueName: \"kubernetes.io/projected/d17813a3-ec29-4465-8a6a-9fa8165b335c-kube-api-access-tqgp4\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.104018 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"9e19bc52-96e1-47ba-83de-cdad48efca4f","Type":"ContainerDied","Data":"e1463ea59eb2972732082efc7919fb73aca7f7f970788fdf7fc873f1e5cb52ab"} Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.104067 4842 scope.go:117] "RemoveContainer" containerID="537a24a5bb0b6e011b4c285f435d493276c1151c3a90489dd7a5617c9dc3402f" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.104067 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.106471 4842 generic.go:334] "Generic (PLEG): container finished" podID="d17813a3-ec29-4465-8a6a-9fa8165b335c" containerID="944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1" exitCode=0 Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.106521 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.106537 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d17813a3-ec29-4465-8a6a-9fa8165b335c","Type":"ContainerDied","Data":"944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1"} Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.106582 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"d17813a3-ec29-4465-8a6a-9fa8165b335c","Type":"ContainerDied","Data":"01b0fb3b8eb4fb6df4dc13c4c6728ff546f813f203643388ac09c56b45150d45"} Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.139688 4842 scope.go:117] "RemoveContainer" containerID="944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.144275 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.155062 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.163930 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.165918 4842 scope.go:117] "RemoveContainer" 
containerID="944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1" Mar 11 19:14:39 crc kubenswrapper[4842]: E0311 19:14:39.166713 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1\": container with ID starting with 944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1 not found: ID does not exist" containerID="944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.167082 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1"} err="failed to get container status \"944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1\": rpc error: code = NotFound desc = could not find container \"944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1\": container with ID starting with 944b52fb84abd76bfe39273a35926f48b1fcf2190505af9ffc6f7bd4ec569fe1 not found: ID does not exist" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.171625 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.467208 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.505403 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-operator-scripts\") pod \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.505529 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn72m\" (UniqueName: \"kubernetes.io/projected/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-kube-api-access-tn72m\") pod \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\" (UID: \"d04790e8-83cc-4d51-ab1c-f18e77eaa1de\") " Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.506508 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d04790e8-83cc-4d51-ab1c-f18e77eaa1de" (UID: "d04790e8-83cc-4d51-ab1c-f18e77eaa1de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.511206 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-kube-api-access-tn72m" (OuterVolumeSpecName: "kube-api-access-tn72m") pod "d04790e8-83cc-4d51-ab1c-f18e77eaa1de" (UID: "d04790e8-83cc-4d51-ab1c-f18e77eaa1de"). InnerVolumeSpecName "kube-api-access-tn72m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.557213 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.606559 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4ngc\" (UniqueName: \"kubernetes.io/projected/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-kube-api-access-l4ngc\") pod \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\" (UID: \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.606603 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-operator-scripts\") pod \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\" (UID: \"bd9f18b6-7d89-488d-b047-c50abe0cfaf6\") " Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.606921 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn72m\" (UniqueName: \"kubernetes.io/projected/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-kube-api-access-tn72m\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.606938 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d04790e8-83cc-4d51-ab1c-f18e77eaa1de-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.607327 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd9f18b6-7d89-488d-b047-c50abe0cfaf6" (UID: "bd9f18b6-7d89-488d-b047-c50abe0cfaf6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.610511 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-kube-api-access-l4ngc" (OuterVolumeSpecName: "kube-api-access-l4ngc") pod "bd9f18b6-7d89-488d-b047-c50abe0cfaf6" (UID: "bd9f18b6-7d89-488d-b047-c50abe0cfaf6"). InnerVolumeSpecName "kube-api-access-l4ngc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.708219 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4ngc\" (UniqueName: \"kubernetes.io/projected/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-kube-api-access-l4ngc\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.708251 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd9f18b6-7d89-488d-b047-c50abe0cfaf6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:39 crc kubenswrapper[4842]: E0311 19:14:39.708339 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-api-config-data: secret "nova-kuttl-api-config-data" not found Mar 11 19:14:39 crc kubenswrapper[4842]: E0311 19:14:39.708416 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data podName:bf3b5b7d-fb77-400c-97f2-f10c5064ffbd nodeName:}" failed. No retries permitted until 2026-03-11 19:14:43.708398073 +0000 UTC m=+1529.356094353 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data") pod "nova-kuttl-api-0" (UID: "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd") : secret "nova-kuttl-api-config-data" not found Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.729856 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.808913 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-logs\") pod \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.809034 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-config-data\") pod \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.809060 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbrcq\" (UniqueName: \"kubernetes.io/projected/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-kube-api-access-rbrcq\") pod \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\" (UID: \"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc\") " Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.809326 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-logs" (OuterVolumeSpecName: "logs") pod "8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" (UID: "8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.809566 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-logs\") on node \"crc\" DevicePath \"\""
Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.811901 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-kube-api-access-rbrcq" (OuterVolumeSpecName: "kube-api-access-rbrcq") pod "8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" (UID: "8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc"). InnerVolumeSpecName "kube-api-access-rbrcq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.827455 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-config-data" (OuterVolumeSpecName: "config-data") pod "8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" (UID: "8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.910828 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:14:39 crc kubenswrapper[4842]: I0311 19:14:39.910857 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbrcq\" (UniqueName: \"kubernetes.io/projected/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc-kube-api-access-rbrcq\") on node \"crc\" DevicePath \"\""
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.119749 4842 generic.go:334] "Generic (PLEG): container finished" podID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerID="209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1" exitCode=0
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.119911 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc","Type":"ContainerDied","Data":"209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1"}
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.119956 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.119975 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc","Type":"ContainerDied","Data":"5d0a7458342c17eeae057ea54b6931414fdcf73b3b4e1a9b52b67af56bed5b88"}
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.119999 4842 scope.go:117] "RemoveContainer" containerID="209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.124600 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n" event={"ID":"bd9f18b6-7d89-488d-b047-c50abe0cfaf6","Type":"ContainerDied","Data":"e9b3c974904a586ee51ea151178c67421c42821c4c7e6d8a9cc9f0a6a19f084b"}
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.124679 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9b3c974904a586ee51ea151178c67421c42821c4c7e6d8a9cc9f0a6a19f084b"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.124765 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0a055-account-delete-qsx7n"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.127078 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.127054 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapi3968-account-delete-sr2nh" event={"ID":"d04790e8-83cc-4d51-ab1c-f18e77eaa1de","Type":"ContainerDied","Data":"d11a279a87598a498c13059180aa8205b1c81157f7a808b5b01de342fa5ce19f"}
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.127180 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d11a279a87598a498c13059180aa8205b1c81157f7a808b5b01de342fa5ce19f"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.140637 4842 scope.go:117] "RemoveContainer" containerID="0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.163490 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.167022 4842 scope.go:117] "RemoveContainer" containerID="209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1"
Mar 11 19:14:40 crc kubenswrapper[4842]: E0311 19:14:40.167497 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1\": container with ID starting with 209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1 not found: ID does not exist" containerID="209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.167539 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1"} err="failed to get container status \"209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1\": rpc error: code = NotFound desc = could not find container \"209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1\": container with ID starting with 209e05e7b275d262ee4bc31b4825f304f1065da7ef740c85d835e1d3ef5164f1 not found: ID does not exist"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.167569 4842 scope.go:117] "RemoveContainer" containerID="0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3"
Mar 11 19:14:40 crc kubenswrapper[4842]: E0311 19:14:40.167905 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3\": container with ID starting with 0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3 not found: ID does not exist" containerID="0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.167966 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3"} err="failed to get container status \"0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3\": rpc error: code = NotFound desc = could not find container \"0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3\": container with ID starting with 0051baf066e9883a893c18e54dea8fc51be5ad399cc25796dd04276de117e2a3 not found: ID does not exist"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.171045 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.695094 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.728800 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-logs\") pod \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") "
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.728853 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65952\" (UniqueName: \"kubernetes.io/projected/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-kube-api-access-65952\") pod \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") "
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.729030 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data\") pod \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\" (UID: \"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd\") "
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.729322 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-logs" (OuterVolumeSpecName: "logs") pod "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" (UID: "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.730676 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-logs\") on node \"crc\" DevicePath \"\""
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.734826 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-kube-api-access-65952" (OuterVolumeSpecName: "kube-api-access-65952") pod "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" (UID: "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd"). InnerVolumeSpecName "kube-api-access-65952". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.751746 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data" (OuterVolumeSpecName: "config-data") pod "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" (UID: "bf3b5b7d-fb77-400c-97f2-f10c5064ffbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.831686 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.831724 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65952\" (UniqueName: \"kubernetes.io/projected/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd-kube-api-access-65952\") on node \"crc\" DevicePath \"\""
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.971617 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" path="/var/lib/kubelet/pods/8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc/volumes"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.972172 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e19bc52-96e1-47ba-83de-cdad48efca4f" path="/var/lib/kubelet/pods/9e19bc52-96e1-47ba-83de-cdad48efca4f/volumes"
Mar 11 19:14:40 crc kubenswrapper[4842]: I0311 19:14:40.972649 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17813a3-ec29-4465-8a6a-9fa8165b335c" path="/var/lib/kubelet/pods/d17813a3-ec29-4465-8a6a-9fa8165b335c/volumes"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.147696 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-db-create-p2t4f"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.150047 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd","Type":"ContainerDied","Data":"2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745"}
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.149666 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.149593 4842 generic.go:334] "Generic (PLEG): container finished" podID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerID="2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745" exitCode=0
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.150465 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"bf3b5b7d-fb77-400c-97f2-f10c5064ffbd","Type":"ContainerDied","Data":"cb8205d05ddf3c6c91628b4b00a6ea5a405bf33e0b4994df2604b74f75cefff6"}
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.150265 4842 scope.go:117] "RemoveContainer" containerID="2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.169909 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-db-create-p2t4f"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.182839 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-3968-account-create-update-v22xt"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.185008 4842 scope.go:117] "RemoveContainer" containerID="826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.193388 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-3968-account-create-update-v22xt"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.203011 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.208887 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.213630 4842 scope.go:117] "RemoveContainer" containerID="2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745"
Mar 11 19:14:41 crc kubenswrapper[4842]: E0311 19:14:41.215149 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745\": container with ID starting with 2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745 not found: ID does not exist" containerID="2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.215225 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745"} err="failed to get container status \"2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745\": rpc error: code = NotFound desc = could not find container \"2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745\": container with ID starting with 2c59908de0869059420dadb74470e71d4f290c91fb8d499c4cfe61815970e745 not found: ID does not exist"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.215253 4842 scope.go:117] "RemoveContainer" containerID="826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5"
Mar 11 19:14:41 crc kubenswrapper[4842]: E0311 19:14:41.216584 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5\": container with ID starting with 826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5 not found: ID does not exist" containerID="826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.216637 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5"} err="failed to get container status \"826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5\": rpc error: code = NotFound desc = could not find container \"826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5\": container with ID starting with 826c999a3a3ba154ce142e00b68050e07810a65976ee1436ae5901e13ced60f5 not found: ID does not exist"
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.221195 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novaapi3968-account-delete-sr2nh"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.235972 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novaapi3968-account-delete-sr2nh"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.251436 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-m8r7v"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.259878 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-m8r7v"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.266619 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell0a055-account-delete-qsx7n"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.272516 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.277908 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell0a055-account-delete-qsx7n"]
Mar 11 19:14:41 crc kubenswrapper[4842]: I0311 19:14:41.283246 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-a055-account-create-update-g92kl"]
Mar 11 19:14:42 crc kubenswrapper[4842]: I0311 19:14:42.979192 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a909f9f5-c1ce-437f-a60c-3f5e73fd5f40" path="/var/lib/kubelet/pods/a909f9f5-c1ce-437f-a60c-3f5e73fd5f40/volumes"
Mar 11 19:14:42 crc kubenswrapper[4842]: I0311 19:14:42.979920 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd9f18b6-7d89-488d-b047-c50abe0cfaf6" path="/var/lib/kubelet/pods/bd9f18b6-7d89-488d-b047-c50abe0cfaf6/volumes"
Mar 11 19:14:42 crc kubenswrapper[4842]: I0311 19:14:42.980501 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" path="/var/lib/kubelet/pods/bf3b5b7d-fb77-400c-97f2-f10c5064ffbd/volumes"
Mar 11 19:14:42 crc kubenswrapper[4842]: I0311 19:14:42.981495 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ee4193-bc4e-4684-8ce1-92c4db5864f2" path="/var/lib/kubelet/pods/c1ee4193-bc4e-4684-8ce1-92c4db5864f2/volumes"
Mar 11 19:14:42 crc kubenswrapper[4842]: I0311 19:14:42.981992 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04790e8-83cc-4d51-ab1c-f18e77eaa1de" path="/var/lib/kubelet/pods/d04790e8-83cc-4d51-ab1c-f18e77eaa1de/volumes"
Mar 11 19:14:42 crc kubenswrapper[4842]: I0311 19:14:42.982538 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0cc46d-a148-42b6-a184-8cbd5e5c14e4" path="/var/lib/kubelet/pods/da0cc46d-a148-42b6-a184-8cbd5e5c14e4/volumes"
Mar 11 19:14:42 crc kubenswrapper[4842]: I0311 19:14:42.983162 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb473bb-9bf7-4bab-91e8-eef4e6931322" path="/var/lib/kubelet/pods/efb473bb-9bf7-4bab-91e8-eef4e6931322/volumes"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.587571 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-db-create-f2qc7"]
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.587938 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.587969 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.587983 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd9f18b6-7d89-488d-b047-c50abe0cfaf6" containerName="mariadb-account-delete"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.587991 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd9f18b6-7d89-488d-b047-c50abe0cfaf6" containerName="mariadb-account-delete"
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.588010 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-log"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588022 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-log"
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.588035 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-log"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588044 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-log"
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.588064 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04790e8-83cc-4d51-ab1c-f18e77eaa1de" containerName="mariadb-account-delete"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588073 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04790e8-83cc-4d51-ab1c-f18e77eaa1de" containerName="mariadb-account-delete"
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.588091 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-api"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588101 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-api"
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.588129 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17813a3-ec29-4465-8a6a-9fa8165b335c" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588138 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17813a3-ec29-4465-8a6a-9fa8165b335c" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:14:43 crc kubenswrapper[4842]: E0311 19:14:43.588158 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e19bc52-96e1-47ba-83de-cdad48efca4f" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588166 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e19bc52-96e1-47ba-83de-cdad48efca4f" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588371 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17813a3-ec29-4465-8a6a-9fa8165b335c" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588400 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04790e8-83cc-4d51-ab1c-f18e77eaa1de" containerName="mariadb-account-delete"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588420 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-api"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588439 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588458 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e19bc52-96e1-47ba-83de-cdad48efca4f" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588476 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd9f18b6-7d89-488d-b047-c50abe0cfaf6" containerName="mariadb-account-delete"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588492 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0cdfa2-cc61-46cf-9a0f-4138c86f0ebc" containerName="nova-kuttl-metadata-log"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.588506 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3b5b7d-fb77-400c-97f2-f10c5064ffbd" containerName="nova-kuttl-api-log"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.589123 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.604302 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-f2qc7"]
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.683743 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d18db8f-38cf-408a-9a11-48fdc55fc29f-operator-scripts\") pod \"nova-api-db-create-f2qc7\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.683811 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxs4\" (UniqueName: \"kubernetes.io/projected/4d18db8f-38cf-408a-9a11-48fdc55fc29f-kube-api-access-7zxs4\") pod \"nova-api-db-create-f2qc7\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.691820 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-sf6dp"]
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.692972 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.702718 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-sf6dp"]
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.785136 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmpfd\" (UniqueName: \"kubernetes.io/projected/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-kube-api-access-qmpfd\") pod \"nova-cell0-db-create-sf6dp\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.785214 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d18db8f-38cf-408a-9a11-48fdc55fc29f-operator-scripts\") pod \"nova-api-db-create-f2qc7\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.785259 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxs4\" (UniqueName: \"kubernetes.io/projected/4d18db8f-38cf-408a-9a11-48fdc55fc29f-kube-api-access-7zxs4\") pod \"nova-api-db-create-f2qc7\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.785333 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-operator-scripts\") pod \"nova-cell0-db-create-sf6dp\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.785996 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d18db8f-38cf-408a-9a11-48fdc55fc29f-operator-scripts\") pod \"nova-api-db-create-f2qc7\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.808398 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"]
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.808824 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxs4\" (UniqueName: \"kubernetes.io/projected/4d18db8f-38cf-408a-9a11-48fdc55fc29f-kube-api-access-7zxs4\") pod \"nova-api-db-create-f2qc7\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.809501 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.812653 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-api-db-secret"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.820737 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"]
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.890753 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z2p\" (UniqueName: \"kubernetes.io/projected/fab2ee5c-61e7-4313-af6f-8e6df74b134d-kube-api-access-44z2p\") pod \"nova-api-9f5a-account-create-update-gb4rv\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.892079 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-operator-scripts\") pod \"nova-cell0-db-create-sf6dp\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.892765 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fab2ee5c-61e7-4313-af6f-8e6df74b134d-operator-scripts\") pod \"nova-api-9f5a-account-create-update-gb4rv\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.892881 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmpfd\" (UniqueName: \"kubernetes.io/projected/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-kube-api-access-qmpfd\") pod \"nova-cell0-db-create-sf6dp\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.894910 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-operator-scripts\") pod \"nova-cell0-db-create-sf6dp\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.903596 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-552pr"]
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.904861 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-552pr"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.919958 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmpfd\" (UniqueName: \"kubernetes.io/projected/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-kube-api-access-qmpfd\") pod \"nova-cell0-db-create-sf6dp\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.924166 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-f2qc7"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.924949 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-552pr"]
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.994743 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z2p\" (UniqueName: \"kubernetes.io/projected/fab2ee5c-61e7-4313-af6f-8e6df74b134d-kube-api-access-44z2p\") pod \"nova-api-9f5a-account-create-update-gb4rv\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.994867 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcnlj\" (UniqueName: \"kubernetes.io/projected/929e25d9-24c3-457b-b067-f925aa4326ac-kube-api-access-rcnlj\") pod \"nova-cell1-db-create-552pr\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " pod="nova-kuttl-default/nova-cell1-db-create-552pr"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.994895 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/929e25d9-24c3-457b-b067-f925aa4326ac-operator-scripts\") pod \"nova-cell1-db-create-552pr\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " pod="nova-kuttl-default/nova-cell1-db-create-552pr"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.994922 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fab2ee5c-61e7-4313-af6f-8e6df74b134d-operator-scripts\") pod \"nova-api-9f5a-account-create-update-gb4rv\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"
Mar 11 19:14:43 crc kubenswrapper[4842]: I0311 19:14:43.996025 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fab2ee5c-61e7-4313-af6f-8e6df74b134d-operator-scripts\") pod \"nova-api-9f5a-account-create-update-gb4rv\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.019642 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z2p\" (UniqueName: \"kubernetes.io/projected/fab2ee5c-61e7-4313-af6f-8e6df74b134d-kube-api-access-44z2p\") pod \"nova-api-9f5a-account-create-update-gb4rv\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.032756 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"]
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.033838 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.036380 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell0-db-secret"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.048265 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"]
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.065583 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.115024 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdv5\" (UniqueName: \"kubernetes.io/projected/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-kube-api-access-krdv5\") pod \"nova-cell0-f700-account-create-update-w7qgq\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.115117 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-operator-scripts\") pod \"nova-cell0-f700-account-create-update-w7qgq\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.115176 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcnlj\" (UniqueName: \"kubernetes.io/projected/929e25d9-24c3-457b-b067-f925aa4326ac-kube-api-access-rcnlj\") pod \"nova-cell1-db-create-552pr\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " pod="nova-kuttl-default/nova-cell1-db-create-552pr"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.115206 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/929e25d9-24c3-457b-b067-f925aa4326ac-operator-scripts\") pod \"nova-cell1-db-create-552pr\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " pod="nova-kuttl-default/nova-cell1-db-create-552pr"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.118133 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/929e25d9-24c3-457b-b067-f925aa4326ac-operator-scripts\") pod \"nova-cell1-db-create-552pr\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " pod="nova-kuttl-default/nova-cell1-db-create-552pr"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.137411 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcnlj\" (UniqueName: \"kubernetes.io/projected/929e25d9-24c3-457b-b067-f925aa4326ac-kube-api-access-rcnlj\") pod \"nova-cell1-db-create-552pr\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " pod="nova-kuttl-default/nova-cell1-db-create-552pr"
Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.152039 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.216305 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krdv5\" (UniqueName: \"kubernetes.io/projected/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-kube-api-access-krdv5\") pod \"nova-cell0-f700-account-create-update-w7qgq\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.216390 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-operator-scripts\") pod \"nova-cell0-f700-account-create-update-w7qgq\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.217089 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-operator-scripts\") pod \"nova-cell0-f700-account-create-update-w7qgq\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.222749 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq"] Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.224731 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.228593 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell1-db-secret" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.231096 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq"] Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.239657 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdv5\" (UniqueName: \"kubernetes.io/projected/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-kube-api-access-krdv5\") pod \"nova-cell0-f700-account-create-update-w7qgq\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.259150 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-552pr" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.319804 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385d679-2159-40b8-afae-681623c5faac-operator-scripts\") pod \"nova-cell1-f473-account-create-update-fsrrq\" (UID: \"2385d679-2159-40b8-afae-681623c5faac\") " pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.319948 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4djst\" (UniqueName: \"kubernetes.io/projected/2385d679-2159-40b8-afae-681623c5faac-kube-api-access-4djst\") pod \"nova-cell1-f473-account-create-update-fsrrq\" (UID: \"2385d679-2159-40b8-afae-681623c5faac\") " pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.422070 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385d679-2159-40b8-afae-681623c5faac-operator-scripts\") pod \"nova-cell1-f473-account-create-update-fsrrq\" (UID: \"2385d679-2159-40b8-afae-681623c5faac\") " pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.422211 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4djst\" (UniqueName: \"kubernetes.io/projected/2385d679-2159-40b8-afae-681623c5faac-kube-api-access-4djst\") pod \"nova-cell1-f473-account-create-update-fsrrq\" (UID: \"2385d679-2159-40b8-afae-681623c5faac\") " pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.423757 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385d679-2159-40b8-afae-681623c5faac-operator-scripts\") pod \"nova-cell1-f473-account-create-update-fsrrq\" (UID: \"2385d679-2159-40b8-afae-681623c5faac\") " pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.425805 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.464952 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4djst\" (UniqueName: \"kubernetes.io/projected/2385d679-2159-40b8-afae-681623c5faac-kube-api-access-4djst\") pod \"nova-cell1-f473-account-create-update-fsrrq\" (UID: \"2385d679-2159-40b8-afae-681623c5faac\") " pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.492355 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-f2qc7"] Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.551718 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.655173 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"] Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.689839 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-sf6dp"] Mar 11 19:14:44 crc kubenswrapper[4842]: I0311 19:14:44.926807 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-552pr"] Mar 11 19:14:44 crc kubenswrapper[4842]: W0311 19:14:44.934776 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod929e25d9_24c3_457b_b067_f925aa4326ac.slice/crio-de422dc95e08f5a464f6b589da0a7fd2a24e84903002b6b51ee07522e5fc8591 WatchSource:0}: Error finding container de422dc95e08f5a464f6b589da0a7fd2a24e84903002b6b51ee07522e5fc8591: Status 404 returned error can't find the container with id de422dc95e08f5a464f6b589da0a7fd2a24e84903002b6b51ee07522e5fc8591 Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.050983 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"] Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.133957 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq"] Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.192565 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" event={"ID":"fab2ee5c-61e7-4313-af6f-8e6df74b134d","Type":"ContainerStarted","Data":"1e28125473f9ef9d72fdbfd26586f58ebeb207de186e4740d00cac9083c3c4d6"} Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.196410 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" event={"ID":"2385d679-2159-40b8-afae-681623c5faac","Type":"ContainerStarted","Data":"856b0fbd3ee4472f481b1cf559667ee37a97b407f59b645d726987f416ad7628"} Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.198047 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" event={"ID":"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf","Type":"ContainerStarted","Data":"1548ecb400bdd56b127a2b5eb8ed8ba28027b8860a20b861e652d8ef89a2a1a3"} Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.199742 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp" event={"ID":"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa","Type":"ContainerStarted","Data":"0ad53f98e6a5e341904d726588deb263b174facc44669aa7adc6e476f1c4fbfd"} Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.201092 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-552pr" event={"ID":"929e25d9-24c3-457b-b067-f925aa4326ac","Type":"ContainerStarted","Data":"de422dc95e08f5a464f6b589da0a7fd2a24e84903002b6b51ee07522e5fc8591"} Mar 11 19:14:45 crc kubenswrapper[4842]: I0311 19:14:45.202371 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-f2qc7" event={"ID":"4d18db8f-38cf-408a-9a11-48fdc55fc29f","Type":"ContainerStarted","Data":"e2b2c9b4250162eb8aa8ddf30d9ba6b3dc7833f9ae3681387f59f998ba27eebf"} Mar 11 19:14:46 crc kubenswrapper[4842]: I0311 19:14:46.216183 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-f2qc7" event={"ID":"4d18db8f-38cf-408a-9a11-48fdc55fc29f","Type":"ContainerStarted","Data":"2ccd974ce721f039dbf24694518bc042cdca375b64ef14687b821187c50c178c"} Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.227609 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" event={"ID":"fab2ee5c-61e7-4313-af6f-8e6df74b134d","Type":"ContainerStarted","Data":"ed8c4dc680cfc8295bd431eb3cacc2435203d6b35b9916942752ad00dbc9996d"} Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.229042 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" event={"ID":"2385d679-2159-40b8-afae-681623c5faac","Type":"ContainerStarted","Data":"ea4c54b26c9ad7259977dddef7f0094084dd6dc1317bfef6bbc8086968940002"} Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.230362 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" event={"ID":"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf","Type":"ContainerStarted","Data":"f1a58b8d1fd964a5881734d0dc88d1f8301bcc4d29bf3251334dfee0e923a2ee"} Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.231415 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp" event={"ID":"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa","Type":"ContainerStarted","Data":"764686a44630d563dbf64bed8e7376038bbc92363e11f7ed0ab264ddde643bfc"} Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.233341 4842 generic.go:334] "Generic (PLEG): container finished" podID="929e25d9-24c3-457b-b067-f925aa4326ac" containerID="52edfb9d8d37e047f5d0d37b453e2827b926eab4c1c4b4046c14ed42e68bdf82" exitCode=0 Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.233895 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-552pr" event={"ID":"929e25d9-24c3-457b-b067-f925aa4326ac","Type":"ContainerDied","Data":"52edfb9d8d37e047f5d0d37b453e2827b926eab4c1c4b4046c14ed42e68bdf82"} Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.249788 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" 
podStartSLOduration=4.2497686980000005 podStartE2EDuration="4.249768698s" podCreationTimestamp="2026-03-11 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:47.243947325 +0000 UTC m=+1532.891643625" watchObservedRunningTime="2026-03-11 19:14:47.249768698 +0000 UTC m=+1532.897464978" Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.279523 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" podStartSLOduration=3.27950707 podStartE2EDuration="3.27950707s" podCreationTimestamp="2026-03-11 19:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:47.274892409 +0000 UTC m=+1532.922588709" watchObservedRunningTime="2026-03-11 19:14:47.27950707 +0000 UTC m=+1532.927203340" Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.321796 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" podStartSLOduration=3.321768162 podStartE2EDuration="3.321768162s" podCreationTimestamp="2026-03-11 19:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:47.29395871 +0000 UTC m=+1532.941655040" watchObservedRunningTime="2026-03-11 19:14:47.321768162 +0000 UTC m=+1532.969464472" Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.329258 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-api-db-create-f2qc7" podStartSLOduration=4.329235478 podStartE2EDuration="4.329235478s" podCreationTimestamp="2026-03-11 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 19:14:47.310531556 +0000 UTC m=+1532.958227876" watchObservedRunningTime="2026-03-11 19:14:47.329235478 +0000 UTC m=+1532.976931758" Mar 11 19:14:47 crc kubenswrapper[4842]: I0311 19:14:47.333340 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp" podStartSLOduration=4.333332016 podStartE2EDuration="4.333332016s" podCreationTimestamp="2026-03-11 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:47.32475691 +0000 UTC m=+1532.972453200" watchObservedRunningTime="2026-03-11 19:14:47.333332016 +0000 UTC m=+1532.981028296" Mar 11 19:14:47 crc kubenswrapper[4842]: E0311 19:14:47.351562 4842 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d18db8f_38cf_408a_9a11_48fdc55fc29f.slice/crio-2ccd974ce721f039dbf24694518bc042cdca375b64ef14687b821187c50c178c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d18db8f_38cf_408a_9a11_48fdc55fc29f.slice/crio-conmon-2ccd974ce721f039dbf24694518bc042cdca375b64ef14687b821187c50c178c.scope\": RecentStats: unable to find data in memory cache]" Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.246088 4842 generic.go:334] "Generic (PLEG): container finished" podID="cefa47e9-8b49-4b8f-a48a-d41d73fd62aa" containerID="764686a44630d563dbf64bed8e7376038bbc92363e11f7ed0ab264ddde643bfc" exitCode=0 Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.246644 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp" event={"ID":"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa","Type":"ContainerDied","Data":"764686a44630d563dbf64bed8e7376038bbc92363e11f7ed0ab264ddde643bfc"} Mar 11 19:14:48 crc 
kubenswrapper[4842]: I0311 19:14:48.250552 4842 generic.go:334] "Generic (PLEG): container finished" podID="4d18db8f-38cf-408a-9a11-48fdc55fc29f" containerID="2ccd974ce721f039dbf24694518bc042cdca375b64ef14687b821187c50c178c" exitCode=0 Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.250631 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-f2qc7" event={"ID":"4d18db8f-38cf-408a-9a11-48fdc55fc29f","Type":"ContainerDied","Data":"2ccd974ce721f039dbf24694518bc042cdca375b64ef14687b821187c50c178c"} Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.252626 4842 generic.go:334] "Generic (PLEG): container finished" podID="2385d679-2159-40b8-afae-681623c5faac" containerID="ea4c54b26c9ad7259977dddef7f0094084dd6dc1317bfef6bbc8086968940002" exitCode=0 Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.252734 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" event={"ID":"2385d679-2159-40b8-afae-681623c5faac","Type":"ContainerDied","Data":"ea4c54b26c9ad7259977dddef7f0094084dd6dc1317bfef6bbc8086968940002"} Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.586123 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-552pr" Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.598857 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/929e25d9-24c3-457b-b067-f925aa4326ac-operator-scripts\") pod \"929e25d9-24c3-457b-b067-f925aa4326ac\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.598975 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcnlj\" (UniqueName: \"kubernetes.io/projected/929e25d9-24c3-457b-b067-f925aa4326ac-kube-api-access-rcnlj\") pod \"929e25d9-24c3-457b-b067-f925aa4326ac\" (UID: \"929e25d9-24c3-457b-b067-f925aa4326ac\") " Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.599458 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/929e25d9-24c3-457b-b067-f925aa4326ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "929e25d9-24c3-457b-b067-f925aa4326ac" (UID: "929e25d9-24c3-457b-b067-f925aa4326ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.606495 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/929e25d9-24c3-457b-b067-f925aa4326ac-kube-api-access-rcnlj" (OuterVolumeSpecName: "kube-api-access-rcnlj") pod "929e25d9-24c3-457b-b067-f925aa4326ac" (UID: "929e25d9-24c3-457b-b067-f925aa4326ac"). InnerVolumeSpecName "kube-api-access-rcnlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.700962 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/929e25d9-24c3-457b-b067-f925aa4326ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:48 crc kubenswrapper[4842]: I0311 19:14:48.701002 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcnlj\" (UniqueName: \"kubernetes.io/projected/929e25d9-24c3-457b-b067-f925aa4326ac-kube-api-access-rcnlj\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.262697 4842 generic.go:334] "Generic (PLEG): container finished" podID="fab2ee5c-61e7-4313-af6f-8e6df74b134d" containerID="ed8c4dc680cfc8295bd431eb3cacc2435203d6b35b9916942752ad00dbc9996d" exitCode=0 Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.262759 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" event={"ID":"fab2ee5c-61e7-4313-af6f-8e6df74b134d","Type":"ContainerDied","Data":"ed8c4dc680cfc8295bd431eb3cacc2435203d6b35b9916942752ad00dbc9996d"} Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.264228 4842 generic.go:334] "Generic (PLEG): container finished" podID="60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf" containerID="f1a58b8d1fd964a5881734d0dc88d1f8301bcc4d29bf3251334dfee0e923a2ee" exitCode=0 Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.264295 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" event={"ID":"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf","Type":"ContainerDied","Data":"f1a58b8d1fd964a5881734d0dc88d1f8301bcc4d29bf3251334dfee0e923a2ee"} Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.265634 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-552pr" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.266398 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-552pr" event={"ID":"929e25d9-24c3-457b-b067-f925aa4326ac","Type":"ContainerDied","Data":"de422dc95e08f5a464f6b589da0a7fd2a24e84903002b6b51ee07522e5fc8591"} Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.266423 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de422dc95e08f5a464f6b589da0a7fd2a24e84903002b6b51ee07522e5fc8591" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.652165 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-f2qc7" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.715325 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d18db8f-38cf-408a-9a11-48fdc55fc29f-operator-scripts\") pod \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.715374 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zxs4\" (UniqueName: \"kubernetes.io/projected/4d18db8f-38cf-408a-9a11-48fdc55fc29f-kube-api-access-7zxs4\") pod \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\" (UID: \"4d18db8f-38cf-408a-9a11-48fdc55fc29f\") " Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.716162 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d18db8f-38cf-408a-9a11-48fdc55fc29f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d18db8f-38cf-408a-9a11-48fdc55fc29f" (UID: "4d18db8f-38cf-408a-9a11-48fdc55fc29f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.719484 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d18db8f-38cf-408a-9a11-48fdc55fc29f-kube-api-access-7zxs4" (OuterVolumeSpecName: "kube-api-access-7zxs4") pod "4d18db8f-38cf-408a-9a11-48fdc55fc29f" (UID: "4d18db8f-38cf-408a-9a11-48fdc55fc29f"). InnerVolumeSpecName "kube-api-access-7zxs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.753716 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.758949 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.816246 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385d679-2159-40b8-afae-681623c5faac-operator-scripts\") pod \"2385d679-2159-40b8-afae-681623c5faac\" (UID: \"2385d679-2159-40b8-afae-681623c5faac\") " Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.816315 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-operator-scripts\") pod \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.816354 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4djst\" (UniqueName: \"kubernetes.io/projected/2385d679-2159-40b8-afae-681623c5faac-kube-api-access-4djst\") pod \"2385d679-2159-40b8-afae-681623c5faac\" (UID: 
\"2385d679-2159-40b8-afae-681623c5faac\") " Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.816373 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmpfd\" (UniqueName: \"kubernetes.io/projected/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-kube-api-access-qmpfd\") pod \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\" (UID: \"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa\") " Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.816626 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d18db8f-38cf-408a-9a11-48fdc55fc29f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.816639 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zxs4\" (UniqueName: \"kubernetes.io/projected/4d18db8f-38cf-408a-9a11-48fdc55fc29f-kube-api-access-7zxs4\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.816876 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cefa47e9-8b49-4b8f-a48a-d41d73fd62aa" (UID: "cefa47e9-8b49-4b8f-a48a-d41d73fd62aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.817139 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2385d679-2159-40b8-afae-681623c5faac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2385d679-2159-40b8-afae-681623c5faac" (UID: "2385d679-2159-40b8-afae-681623c5faac"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.819124 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-kube-api-access-qmpfd" (OuterVolumeSpecName: "kube-api-access-qmpfd") pod "cefa47e9-8b49-4b8f-a48a-d41d73fd62aa" (UID: "cefa47e9-8b49-4b8f-a48a-d41d73fd62aa"). InnerVolumeSpecName "kube-api-access-qmpfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.819375 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2385d679-2159-40b8-afae-681623c5faac-kube-api-access-4djst" (OuterVolumeSpecName: "kube-api-access-4djst") pod "2385d679-2159-40b8-afae-681623c5faac" (UID: "2385d679-2159-40b8-afae-681623c5faac"). InnerVolumeSpecName "kube-api-access-4djst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.917373 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4djst\" (UniqueName: \"kubernetes.io/projected/2385d679-2159-40b8-afae-681623c5faac-kube-api-access-4djst\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.917406 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmpfd\" (UniqueName: \"kubernetes.io/projected/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-kube-api-access-qmpfd\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.917419 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385d679-2159-40b8-afae-681623c5faac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:49 crc kubenswrapper[4842]: I0311 19:14:49.917428 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.273517 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp" event={"ID":"cefa47e9-8b49-4b8f-a48a-d41d73fd62aa","Type":"ContainerDied","Data":"0ad53f98e6a5e341904d726588deb263b174facc44669aa7adc6e476f1c4fbfd"} Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.273574 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad53f98e6a5e341904d726588deb263b174facc44669aa7adc6e476f1c4fbfd" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.273589 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-sf6dp" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.275202 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-f2qc7" event={"ID":"4d18db8f-38cf-408a-9a11-48fdc55fc29f","Type":"ContainerDied","Data":"e2b2c9b4250162eb8aa8ddf30d9ba6b3dc7833f9ae3681387f59f998ba27eebf"} Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.275243 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2b2c9b4250162eb8aa8ddf30d9ba6b3dc7833f9ae3681387f59f998ba27eebf" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.275256 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-f2qc7" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.277238 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" event={"ID":"2385d679-2159-40b8-afae-681623c5faac","Type":"ContainerDied","Data":"856b0fbd3ee4472f481b1cf559667ee37a97b407f59b645d726987f416ad7628"} Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.277252 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.277260 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856b0fbd3ee4472f481b1cf559667ee37a97b407f59b645d726987f416ad7628" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.642278 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.650890 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.727713 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44z2p\" (UniqueName: \"kubernetes.io/projected/fab2ee5c-61e7-4313-af6f-8e6df74b134d-kube-api-access-44z2p\") pod \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.727813 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krdv5\" (UniqueName: \"kubernetes.io/projected/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-kube-api-access-krdv5\") pod \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.727862 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fab2ee5c-61e7-4313-af6f-8e6df74b134d-operator-scripts\") pod \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\" (UID: \"fab2ee5c-61e7-4313-af6f-8e6df74b134d\") " Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.727881 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-operator-scripts\") pod \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\" (UID: \"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf\") " Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.728423 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf" (UID: "60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.728431 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab2ee5c-61e7-4313-af6f-8e6df74b134d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fab2ee5c-61e7-4313-af6f-8e6df74b134d" (UID: "fab2ee5c-61e7-4313-af6f-8e6df74b134d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.731397 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-kube-api-access-krdv5" (OuterVolumeSpecName: "kube-api-access-krdv5") pod "60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf" (UID: "60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf"). InnerVolumeSpecName "kube-api-access-krdv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.732401 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab2ee5c-61e7-4313-af6f-8e6df74b134d-kube-api-access-44z2p" (OuterVolumeSpecName: "kube-api-access-44z2p") pod "fab2ee5c-61e7-4313-af6f-8e6df74b134d" (UID: "fab2ee5c-61e7-4313-af6f-8e6df74b134d"). InnerVolumeSpecName "kube-api-access-44z2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.829409 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krdv5\" (UniqueName: \"kubernetes.io/projected/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-kube-api-access-krdv5\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.829442 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fab2ee5c-61e7-4313-af6f-8e6df74b134d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.829452 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:50 crc kubenswrapper[4842]: I0311 19:14:50.829462 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44z2p\" (UniqueName: \"kubernetes.io/projected/fab2ee5c-61e7-4313-af6f-8e6df74b134d-kube-api-access-44z2p\") on node \"crc\" DevicePath \"\"" Mar 11 19:14:51 crc kubenswrapper[4842]: I0311 19:14:51.285472 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" event={"ID":"fab2ee5c-61e7-4313-af6f-8e6df74b134d","Type":"ContainerDied","Data":"1e28125473f9ef9d72fdbfd26586f58ebeb207de186e4740d00cac9083c3c4d6"} Mar 11 19:14:51 crc kubenswrapper[4842]: I0311 19:14:51.285845 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e28125473f9ef9d72fdbfd26586f58ebeb207de186e4740d00cac9083c3c4d6" Mar 11 19:14:51 crc kubenswrapper[4842]: I0311 19:14:51.285589 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv" Mar 11 19:14:51 crc kubenswrapper[4842]: I0311 19:14:51.288237 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" event={"ID":"60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf","Type":"ContainerDied","Data":"1548ecb400bdd56b127a2b5eb8ed8ba28027b8860a20b861e652d8ef89a2a1a3"} Mar 11 19:14:51 crc kubenswrapper[4842]: I0311 19:14:51.288291 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1548ecb400bdd56b127a2b5eb8ed8ba28027b8860a20b861e652d8ef89a2a1a3" Mar 11 19:14:51 crc kubenswrapper[4842]: I0311 19:14:51.288339 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.328635 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5"] Mar 11 19:14:54 crc kubenswrapper[4842]: E0311 19:14:54.328929 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d18db8f-38cf-408a-9a11-48fdc55fc29f" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329126 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d18db8f-38cf-408a-9a11-48fdc55fc29f" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: E0311 19:14:54.329136 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329142 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: E0311 19:14:54.329149 4842 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="cefa47e9-8b49-4b8f-a48a-d41d73fd62aa" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329156 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="cefa47e9-8b49-4b8f-a48a-d41d73fd62aa" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: E0311 19:14:54.329174 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab2ee5c-61e7-4313-af6f-8e6df74b134d" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329181 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab2ee5c-61e7-4313-af6f-8e6df74b134d" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: E0311 19:14:54.329198 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929e25d9-24c3-457b-b067-f925aa4326ac" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329203 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="929e25d9-24c3-457b-b067-f925aa4326ac" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: E0311 19:14:54.329215 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2385d679-2159-40b8-afae-681623c5faac" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329221 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="2385d679-2159-40b8-afae-681623c5faac" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329360 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="2385d679-2159-40b8-afae-681623c5faac" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329375 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="cefa47e9-8b49-4b8f-a48a-d41d73fd62aa" containerName="mariadb-database-create" Mar 11 
19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329384 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab2ee5c-61e7-4313-af6f-8e6df74b134d" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329391 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="929e25d9-24c3-457b-b067-f925aa4326ac" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329402 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d18db8f-38cf-408a-9a11-48fdc55fc29f" containerName="mariadb-database-create" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329410 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf" containerName="mariadb-account-create-update" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.329897 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.332355 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-scripts" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.332355 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-zdjxx" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.334790 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.343365 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5"] Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.487504 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.487544 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99mm6\" (UniqueName: \"kubernetes.io/projected/11c61366-3a2d-4208-aa39-949370bd3232-kube-api-access-99mm6\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.487576 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.588685 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.588730 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99mm6\" (UniqueName: \"kubernetes.io/projected/11c61366-3a2d-4208-aa39-949370bd3232-kube-api-access-99mm6\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc 
kubenswrapper[4842]: I0311 19:14:54.588764 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.593650 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.596410 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.604887 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99mm6\" (UniqueName: \"kubernetes.io/projected/11c61366-3a2d-4208-aa39-949370bd3232-kube-api-access-99mm6\") pod \"nova-kuttl-cell0-conductor-db-sync-m94z5\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:54 crc kubenswrapper[4842]: I0311 19:14:54.649471 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:14:55 crc kubenswrapper[4842]: I0311 19:14:55.057785 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5"] Mar 11 19:14:55 crc kubenswrapper[4842]: I0311 19:14:55.333110 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" event={"ID":"11c61366-3a2d-4208-aa39-949370bd3232","Type":"ContainerStarted","Data":"9b0c6736233c151848d04334b1e4be1d1e78c89c2e64a02c249766767e00286d"} Mar 11 19:14:55 crc kubenswrapper[4842]: I0311 19:14:55.333192 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" event={"ID":"11c61366-3a2d-4208-aa39-949370bd3232","Type":"ContainerStarted","Data":"9573e4201c04e04e84fb12b1549b5caf40e9c1e39a6ed4255495bf998abad09f"} Mar 11 19:14:55 crc kubenswrapper[4842]: I0311 19:14:55.357470 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" podStartSLOduration=1.357439687 podStartE2EDuration="1.357439687s" podCreationTimestamp="2026-03-11 19:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:14:55.357062237 +0000 UTC m=+1541.004758517" watchObservedRunningTime="2026-03-11 19:14:55.357439687 +0000 UTC m=+1541.005135977" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.137859 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4"] Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.139447 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.145477 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.146254 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.157378 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4"] Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.189741 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv4bh\" (UniqueName: \"kubernetes.io/projected/8bfb88cd-421b-4657-be01-65063e13a247-kube-api-access-cv4bh\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.189880 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bfb88cd-421b-4657-be01-65063e13a247-secret-volume\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.189910 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bfb88cd-421b-4657-be01-65063e13a247-config-volume\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.291821 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv4bh\" (UniqueName: \"kubernetes.io/projected/8bfb88cd-421b-4657-be01-65063e13a247-kube-api-access-cv4bh\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.291966 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bfb88cd-421b-4657-be01-65063e13a247-secret-volume\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.292007 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bfb88cd-421b-4657-be01-65063e13a247-config-volume\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.293051 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bfb88cd-421b-4657-be01-65063e13a247-config-volume\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.302905 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8bfb88cd-421b-4657-be01-65063e13a247-secret-volume\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.315377 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv4bh\" (UniqueName: \"kubernetes.io/projected/8bfb88cd-421b-4657-be01-65063e13a247-kube-api-access-cv4bh\") pod \"collect-profiles-29554275-bc8w4\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.384100 4842 generic.go:334] "Generic (PLEG): container finished" podID="11c61366-3a2d-4208-aa39-949370bd3232" containerID="9b0c6736233c151848d04334b1e4be1d1e78c89c2e64a02c249766767e00286d" exitCode=0 Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.384155 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" event={"ID":"11c61366-3a2d-4208-aa39-949370bd3232","Type":"ContainerDied","Data":"9b0c6736233c151848d04334b1e4be1d1e78c89c2e64a02c249766767e00286d"} Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.455589 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:00 crc kubenswrapper[4842]: W0311 19:15:00.990453 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bfb88cd_421b_4657_be01_65063e13a247.slice/crio-625dd5adac575ac702f4aeaad3c9f7a0edb8662bc4ccef29363d0d4dbef2f8fa WatchSource:0}: Error finding container 625dd5adac575ac702f4aeaad3c9f7a0edb8662bc4ccef29363d0d4dbef2f8fa: Status 404 returned error can't find the container with id 625dd5adac575ac702f4aeaad3c9f7a0edb8662bc4ccef29363d0d4dbef2f8fa Mar 11 19:15:00 crc kubenswrapper[4842]: I0311 19:15:00.991029 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4"] Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.393063 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" event={"ID":"8bfb88cd-421b-4657-be01-65063e13a247","Type":"ContainerStarted","Data":"b26c5e68adcdebe6a6a2826411e3a9f8d9454d7f6537fe647261297b720055c0"} Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.393363 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" event={"ID":"8bfb88cd-421b-4657-be01-65063e13a247","Type":"ContainerStarted","Data":"625dd5adac575ac702f4aeaad3c9f7a0edb8662bc4ccef29363d0d4dbef2f8fa"} Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.424017 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" podStartSLOduration=1.423995565 podStartE2EDuration="1.423995565s" podCreationTimestamp="2026-03-11 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 
19:15:01.414410103 +0000 UTC m=+1547.062106393" watchObservedRunningTime="2026-03-11 19:15:01.423995565 +0000 UTC m=+1547.071691845" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.472030 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.472347 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.472451 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.473155 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.473297 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" gracePeriod=600 Mar 11 19:15:01 crc kubenswrapper[4842]: E0311 19:15:01.623295 4842 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.726941 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.850527 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99mm6\" (UniqueName: \"kubernetes.io/projected/11c61366-3a2d-4208-aa39-949370bd3232-kube-api-access-99mm6\") pod \"11c61366-3a2d-4208-aa39-949370bd3232\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.851697 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-scripts\") pod \"11c61366-3a2d-4208-aa39-949370bd3232\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.851734 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-config-data\") pod \"11c61366-3a2d-4208-aa39-949370bd3232\" (UID: \"11c61366-3a2d-4208-aa39-949370bd3232\") " Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.856458 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-scripts" (OuterVolumeSpecName: "scripts") pod "11c61366-3a2d-4208-aa39-949370bd3232" (UID: 
"11c61366-3a2d-4208-aa39-949370bd3232"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.857198 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c61366-3a2d-4208-aa39-949370bd3232-kube-api-access-99mm6" (OuterVolumeSpecName: "kube-api-access-99mm6") pod "11c61366-3a2d-4208-aa39-949370bd3232" (UID: "11c61366-3a2d-4208-aa39-949370bd3232"). InnerVolumeSpecName "kube-api-access-99mm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.878711 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-config-data" (OuterVolumeSpecName: "config-data") pod "11c61366-3a2d-4208-aa39-949370bd3232" (UID: "11c61366-3a2d-4208-aa39-949370bd3232"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.954399 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99mm6\" (UniqueName: \"kubernetes.io/projected/11c61366-3a2d-4208-aa39-949370bd3232-kube-api-access-99mm6\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.954487 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:01 crc kubenswrapper[4842]: I0311 19:15:01.954500 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c61366-3a2d-4208-aa39-949370bd3232-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.402255 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" 
event={"ID":"11c61366-3a2d-4208-aa39-949370bd3232","Type":"ContainerDied","Data":"9573e4201c04e04e84fb12b1549b5caf40e9c1e39a6ed4255495bf998abad09f"} Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.402322 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9573e4201c04e04e84fb12b1549b5caf40e9c1e39a6ed4255495bf998abad09f" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.402332 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.404801 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" exitCode=0 Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.404871 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"} Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.405082 4842 scope.go:117] "RemoveContainer" containerID="0e0132978f744075878dba0b5cc46ab6911da7e9e6a8e99f3a4db40255e33bd4" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.405685 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:15:02 crc kubenswrapper[4842]: E0311 19:15:02.405905 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.407131 4842 generic.go:334] "Generic (PLEG): container finished" podID="8bfb88cd-421b-4657-be01-65063e13a247" containerID="b26c5e68adcdebe6a6a2826411e3a9f8d9454d7f6537fe647261297b720055c0" exitCode=0 Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.407162 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" event={"ID":"8bfb88cd-421b-4657-be01-65063e13a247","Type":"ContainerDied","Data":"b26c5e68adcdebe6a6a2826411e3a9f8d9454d7f6537fe647261297b720055c0"} Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.503961 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:15:02 crc kubenswrapper[4842]: E0311 19:15:02.504346 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c61366-3a2d-4208-aa39-949370bd3232" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.504365 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c61366-3a2d-4208-aa39-949370bd3232" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.504549 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c61366-3a2d-4208-aa39-949370bd3232" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.505174 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.509231 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.509442 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-zdjxx" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.543091 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.579641 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a208ab67-07bd-43b7-8ec5-1408824103b8-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.579808 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlxnz\" (UniqueName: \"kubernetes.io/projected/a208ab67-07bd-43b7-8ec5-1408824103b8-kube-api-access-mlxnz\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.681620 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlxnz\" (UniqueName: \"kubernetes.io/projected/a208ab67-07bd-43b7-8ec5-1408824103b8-kube-api-access-mlxnz\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.681776 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a208ab67-07bd-43b7-8ec5-1408824103b8-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.694172 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a208ab67-07bd-43b7-8ec5-1408824103b8-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.716524 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlxnz\" (UniqueName: \"kubernetes.io/projected/a208ab67-07bd-43b7-8ec5-1408824103b8-kube-api-access-mlxnz\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:02 crc kubenswrapper[4842]: I0311 19:15:02.829442 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.291837 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:15:03 crc kubenswrapper[4842]: W0311 19:15:03.300591 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda208ab67_07bd_43b7_8ec5_1408824103b8.slice/crio-b5e049d5ae6248c8a4de6a9a73c751874a98ef4d18e313dc0f81119b88c2c51d WatchSource:0}: Error finding container b5e049d5ae6248c8a4de6a9a73c751874a98ef4d18e313dc0f81119b88c2c51d: Status 404 returned error can't find the container with id b5e049d5ae6248c8a4de6a9a73c751874a98ef4d18e313dc0f81119b88c2c51d Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.419136 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"a208ab67-07bd-43b7-8ec5-1408824103b8","Type":"ContainerStarted","Data":"b5e049d5ae6248c8a4de6a9a73c751874a98ef4d18e313dc0f81119b88c2c51d"} Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.747708 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.908258 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv4bh\" (UniqueName: \"kubernetes.io/projected/8bfb88cd-421b-4657-be01-65063e13a247-kube-api-access-cv4bh\") pod \"8bfb88cd-421b-4657-be01-65063e13a247\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.908432 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bfb88cd-421b-4657-be01-65063e13a247-secret-volume\") pod \"8bfb88cd-421b-4657-be01-65063e13a247\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.908539 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bfb88cd-421b-4657-be01-65063e13a247-config-volume\") pod \"8bfb88cd-421b-4657-be01-65063e13a247\" (UID: \"8bfb88cd-421b-4657-be01-65063e13a247\") " Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.910866 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bfb88cd-421b-4657-be01-65063e13a247-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bfb88cd-421b-4657-be01-65063e13a247" (UID: "8bfb88cd-421b-4657-be01-65063e13a247"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.913860 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bfb88cd-421b-4657-be01-65063e13a247-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8bfb88cd-421b-4657-be01-65063e13a247" (UID: "8bfb88cd-421b-4657-be01-65063e13a247"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:03 crc kubenswrapper[4842]: I0311 19:15:03.914006 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bfb88cd-421b-4657-be01-65063e13a247-kube-api-access-cv4bh" (OuterVolumeSpecName: "kube-api-access-cv4bh") pod "8bfb88cd-421b-4657-be01-65063e13a247" (UID: "8bfb88cd-421b-4657-be01-65063e13a247"). InnerVolumeSpecName "kube-api-access-cv4bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.010584 4842 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bfb88cd-421b-4657-be01-65063e13a247-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.010616 4842 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bfb88cd-421b-4657-be01-65063e13a247-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.010626 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv4bh\" (UniqueName: \"kubernetes.io/projected/8bfb88cd-421b-4657-be01-65063e13a247-kube-api-access-cv4bh\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.430813 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" event={"ID":"8bfb88cd-421b-4657-be01-65063e13a247","Type":"ContainerDied","Data":"625dd5adac575ac702f4aeaad3c9f7a0edb8662bc4ccef29363d0d4dbef2f8fa"} Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.431868 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625dd5adac575ac702f4aeaad3c9f7a0edb8662bc4ccef29363d0d4dbef2f8fa" Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.430827 4842 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554275-bc8w4" Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.433163 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"a208ab67-07bd-43b7-8ec5-1408824103b8","Type":"ContainerStarted","Data":"4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216"} Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.433201 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:04 crc kubenswrapper[4842]: I0311 19:15:04.454736 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podStartSLOduration=2.454721671 podStartE2EDuration="2.454721671s" podCreationTimestamp="2026-03-11 19:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:04.453816307 +0000 UTC m=+1550.101512587" watchObservedRunningTime="2026-03-11 19:15:04.454721671 +0000 UTC m=+1550.102417951" Mar 11 19:15:12 crc kubenswrapper[4842]: I0311 19:15:12.861886 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.485948 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n"] Mar 11 19:15:13 crc kubenswrapper[4842]: E0311 19:15:13.497933 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfb88cd-421b-4657-be01-65063e13a247" containerName="collect-profiles" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.497971 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfb88cd-421b-4657-be01-65063e13a247" containerName="collect-profiles" Mar 11 19:15:13 crc 
kubenswrapper[4842]: I0311 19:15:13.498187 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bfb88cd-421b-4657-be01-65063e13a247" containerName="collect-profiles" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.499071 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.500133 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.502494 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-config-data" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.506252 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-scripts" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.662101 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-scripts\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.662151 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286cf\" (UniqueName: \"kubernetes.io/projected/cd150eed-cae1-4a99-a90f-d533d05070bf-kube-api-access-286cf\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.662342 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-config-data\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.752340 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.753668 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.760569 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.763809 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.764377 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-scripts\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.764844 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.765217 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-286cf\" (UniqueName: \"kubernetes.io/projected/cd150eed-cae1-4a99-a90f-d533d05070bf-kube-api-access-286cf\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.765385 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-config-data\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.773038 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-scripts\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.780354 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.786520 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.788237 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-config-data\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " 
pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.804701 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.812850 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-286cf\" (UniqueName: \"kubernetes.io/projected/cd150eed-cae1-4a99-a90f-d533d05070bf-kube-api-access-286cf\") pod \"nova-kuttl-cell0-cell-mapping-8fd8n\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.817638 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.847843 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.849254 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.851343 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.853770 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.869002 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9963ae2-f79a-41fe-bc41-20222f48c6e6-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.869050 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plzx\" (UniqueName: \"kubernetes.io/projected/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-kube-api-access-8plzx\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.869073 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-config-data\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.869121 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-logs\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 
crc kubenswrapper[4842]: I0311 19:15:13.869141 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhbs\" (UniqueName: \"kubernetes.io/projected/f9963ae2-f79a-41fe-bc41-20222f48c6e6-kube-api-access-8lhbs\") pod \"nova-kuttl-scheduler-0\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.969785 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970541 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-logs\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970586 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhbs\" (UniqueName: \"kubernetes.io/projected/f9963ae2-f79a-41fe-bc41-20222f48c6e6-kube-api-access-8lhbs\") pod \"nova-kuttl-scheduler-0\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970618 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f55c740-f76d-4d07-8b16-2414d33494e7-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970666 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hdx5\" (UniqueName: 
\"kubernetes.io/projected/1f55c740-f76d-4d07-8b16-2414d33494e7-kube-api-access-8hdx5\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970695 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f55c740-f76d-4d07-8b16-2414d33494e7-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970726 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9963ae2-f79a-41fe-bc41-20222f48c6e6-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970749 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plzx\" (UniqueName: \"kubernetes.io/projected/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-kube-api-access-8plzx\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970766 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-config-data\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.970811 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.971183 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-logs\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.976398 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9963ae2-f79a-41fe-bc41-20222f48c6e6-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.979549 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-novncproxy-config-data" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.986197 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.993552 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhbs\" (UniqueName: \"kubernetes.io/projected/f9963ae2-f79a-41fe-bc41-20222f48c6e6-kube-api-access-8lhbs\") pod \"nova-kuttl-scheduler-0\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:13 crc kubenswrapper[4842]: I0311 19:15:13.996557 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-config-data\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.015628 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plzx\" (UniqueName: \"kubernetes.io/projected/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-kube-api-access-8plzx\") pod \"nova-kuttl-api-0\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.073489 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f55c740-f76d-4d07-8b16-2414d33494e7-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.073830 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.073882 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hdx5\" (UniqueName: \"kubernetes.io/projected/1f55c740-f76d-4d07-8b16-2414d33494e7-kube-api-access-8hdx5\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.073906 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f55c740-f76d-4d07-8b16-2414d33494e7-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.074105 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5259\" (UniqueName: \"kubernetes.io/projected/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-kube-api-access-p5259\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.074503 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f55c740-f76d-4d07-8b16-2414d33494e7-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.080543 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f55c740-f76d-4d07-8b16-2414d33494e7-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.089731 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hdx5\" (UniqueName: \"kubernetes.io/projected/1f55c740-f76d-4d07-8b16-2414d33494e7-kube-api-access-8hdx5\") pod \"nova-kuttl-metadata-0\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.175715 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.175819 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p5259\" (UniqueName: \"kubernetes.io/projected/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-kube-api-access-p5259\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.181730 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.192501 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5259\" (UniqueName: \"kubernetes.io/projected/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-kube-api-access-p5259\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.214489 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.231413 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.254185 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.293960 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.433739 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n"] Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.455430 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp"] Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.456704 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.460589 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-scripts" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.460641 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.487359 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp"] Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.535112 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.535437 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" event={"ID":"cd150eed-cae1-4a99-a90f-d533d05070bf","Type":"ContainerStarted","Data":"46bf3fbf4a5cd10ddbdefe60d9f2fafd75afdc203edbdc67621721e1033bd679"} Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.567385 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.584382 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.584428 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.584473 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5prrv\" (UniqueName: \"kubernetes.io/projected/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-kube-api-access-5prrv\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.685967 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.686011 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " 
pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.686040 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5prrv\" (UniqueName: \"kubernetes.io/projected/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-kube-api-access-5prrv\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.690065 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.690187 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.707304 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5prrv\" (UniqueName: \"kubernetes.io/projected/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-kube-api-access-5prrv\") pod \"nova-kuttl-cell1-conductor-db-sync-q7bdp\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.781772 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.852233 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:15:14 crc kubenswrapper[4842]: I0311 19:15:14.858655 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:14 crc kubenswrapper[4842]: W0311 19:15:14.867737 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c1be6f_1e2e_40e1_95c6_831ccd3a9239.slice/crio-cf71cb2203d591d7a8cafe334a8a007f576ff46b625747c572b090c668d5530f WatchSource:0}: Error finding container cf71cb2203d591d7a8cafe334a8a007f576ff46b625747c572b090c668d5530f: Status 404 returned error can't find the container with id cf71cb2203d591d7a8cafe334a8a007f576ff46b625747c572b090c668d5530f Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:14.992153 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:15:15 crc kubenswrapper[4842]: E0311 19:15:14.992439 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.271311 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp"] Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.544869 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" event={"ID":"55af88a0-a4e9-40e5-8e80-a38d11f5da3f","Type":"ContainerStarted","Data":"5ed25c94a4ce838627dccb27f3a92ad987f8c96a38b142186bc118b557a100a0"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.544915 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" event={"ID":"55af88a0-a4e9-40e5-8e80-a38d11f5da3f","Type":"ContainerStarted","Data":"7f089247c268b4bb84fb6e747a9ab5d850fa647c7ad4ce3eb84e75596b3cfb25"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.549357 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1f55c740-f76d-4d07-8b16-2414d33494e7","Type":"ContainerStarted","Data":"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.549429 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1f55c740-f76d-4d07-8b16-2414d33494e7","Type":"ContainerStarted","Data":"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.549447 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1f55c740-f76d-4d07-8b16-2414d33494e7","Type":"ContainerStarted","Data":"e5a2c16072629cfac8cee00e5475d7f5df27fbd64b497de58909eb3564d60dc7"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.556911 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"f9963ae2-f79a-41fe-bc41-20222f48c6e6","Type":"ContainerStarted","Data":"273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.556969 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" 
event={"ID":"f9963ae2-f79a-41fe-bc41-20222f48c6e6","Type":"ContainerStarted","Data":"791bb66530a4dbc8d9bcf534d4c398353ea4c832eefd1f20edf67171ac44981f"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.558633 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" event={"ID":"cd150eed-cae1-4a99-a90f-d533d05070bf","Type":"ContainerStarted","Data":"9b41acb1732e8d27ffba758b7a44e575b9835935296b698ba4017583310a1799"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.560539 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d","Type":"ContainerStarted","Data":"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.560567 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d","Type":"ContainerStarted","Data":"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.560587 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d","Type":"ContainerStarted","Data":"6b401f76a382b783ff84b5bb7fcf3f25add3ac11ccc50104e49ce0edcc2be9f1"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.563911 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239","Type":"ContainerStarted","Data":"31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.564045 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" 
event={"ID":"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239","Type":"ContainerStarted","Data":"cf71cb2203d591d7a8cafe334a8a007f576ff46b625747c572b090c668d5530f"} Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.569850 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" podStartSLOduration=1.569824517 podStartE2EDuration="1.569824517s" podCreationTimestamp="2026-03-11 19:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:15.561521938 +0000 UTC m=+1561.209218218" watchObservedRunningTime="2026-03-11 19:15:15.569824517 +0000 UTC m=+1561.217520807" Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.584835 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.584816031 podStartE2EDuration="2.584816031s" podCreationTimestamp="2026-03-11 19:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:15.579527022 +0000 UTC m=+1561.227223302" watchObservedRunningTime="2026-03-11 19:15:15.584816031 +0000 UTC m=+1561.232512311" Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.643495 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" podStartSLOduration=2.6434605639999997 podStartE2EDuration="2.643460564s" podCreationTimestamp="2026-03-11 19:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:15.594402633 +0000 UTC m=+1561.242098923" watchObservedRunningTime="2026-03-11 19:15:15.643460564 +0000 UTC m=+1561.291156844" Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.664148 4842 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.664108567 podStartE2EDuration="2.664108567s" podCreationTimestamp="2026-03-11 19:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:15.62240616 +0000 UTC m=+1561.270102450" watchObservedRunningTime="2026-03-11 19:15:15.664108567 +0000 UTC m=+1561.311804847" Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.669231 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.669212071 podStartE2EDuration="2.669212071s" podCreationTimestamp="2026-03-11 19:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:15.651637709 +0000 UTC m=+1561.299334009" watchObservedRunningTime="2026-03-11 19:15:15.669212071 +0000 UTC m=+1561.316908351" Mar 11 19:15:15 crc kubenswrapper[4842]: I0311 19:15:15.683026 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podStartSLOduration=2.683000034 podStartE2EDuration="2.683000034s" podCreationTimestamp="2026-03-11 19:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:15.673889494 +0000 UTC m=+1561.321585784" watchObservedRunningTime="2026-03-11 19:15:15.683000034 +0000 UTC m=+1561.330696314" Mar 11 19:15:18 crc kubenswrapper[4842]: I0311 19:15:18.600242 4842 generic.go:334] "Generic (PLEG): container finished" podID="55af88a0-a4e9-40e5-8e80-a38d11f5da3f" containerID="5ed25c94a4ce838627dccb27f3a92ad987f8c96a38b142186bc118b557a100a0" exitCode=0 Mar 11 19:15:18 crc kubenswrapper[4842]: I0311 19:15:18.600351 4842 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" event={"ID":"55af88a0-a4e9-40e5-8e80-a38d11f5da3f","Type":"ContainerDied","Data":"5ed25c94a4ce838627dccb27f3a92ad987f8c96a38b142186bc118b557a100a0"} Mar 11 19:15:19 crc kubenswrapper[4842]: I0311 19:15:19.232629 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:19 crc kubenswrapper[4842]: I0311 19:15:19.294497 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:19 crc kubenswrapper[4842]: I0311 19:15:19.941991 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.090878 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-scripts\") pod \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.091697 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5prrv\" (UniqueName: \"kubernetes.io/projected/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-kube-api-access-5prrv\") pod \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.091811 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-config-data\") pod \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\" (UID: \"55af88a0-a4e9-40e5-8e80-a38d11f5da3f\") " Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.098229 4842 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-scripts" (OuterVolumeSpecName: "scripts") pod "55af88a0-a4e9-40e5-8e80-a38d11f5da3f" (UID: "55af88a0-a4e9-40e5-8e80-a38d11f5da3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.103592 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-kube-api-access-5prrv" (OuterVolumeSpecName: "kube-api-access-5prrv") pod "55af88a0-a4e9-40e5-8e80-a38d11f5da3f" (UID: "55af88a0-a4e9-40e5-8e80-a38d11f5da3f"). InnerVolumeSpecName "kube-api-access-5prrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.123840 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-config-data" (OuterVolumeSpecName: "config-data") pod "55af88a0-a4e9-40e5-8e80-a38d11f5da3f" (UID: "55af88a0-a4e9-40e5-8e80-a38d11f5da3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.204841 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5prrv\" (UniqueName: \"kubernetes.io/projected/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-kube-api-access-5prrv\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.204903 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.204918 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55af88a0-a4e9-40e5-8e80-a38d11f5da3f-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.631860 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" event={"ID":"55af88a0-a4e9-40e5-8e80-a38d11f5da3f","Type":"ContainerDied","Data":"7f089247c268b4bb84fb6e747a9ab5d850fa647c7ad4ce3eb84e75596b3cfb25"} Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.631927 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f089247c268b4bb84fb6e747a9ab5d850fa647c7ad4ce3eb84e75596b3cfb25" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.631968 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.634776 4842 generic.go:334] "Generic (PLEG): container finished" podID="cd150eed-cae1-4a99-a90f-d533d05070bf" containerID="9b41acb1732e8d27ffba758b7a44e575b9835935296b698ba4017583310a1799" exitCode=0 Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.634836 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" event={"ID":"cd150eed-cae1-4a99-a90f-d533d05070bf","Type":"ContainerDied","Data":"9b41acb1732e8d27ffba758b7a44e575b9835935296b698ba4017583310a1799"} Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.778427 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:15:20 crc kubenswrapper[4842]: E0311 19:15:20.779078 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af88a0-a4e9-40e5-8e80-a38d11f5da3f" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.779102 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af88a0-a4e9-40e5-8e80-a38d11f5da3f" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.779349 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af88a0-a4e9-40e5-8e80-a38d11f5da3f" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.780318 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.782606 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.803484 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.922593 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57nwb\" (UniqueName: \"kubernetes.io/projected/a2088551-e0e8-474b-b472-63b60ca972c0-kube-api-access-57nwb\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"a2088551-e0e8-474b-b472-63b60ca972c0\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:20 crc kubenswrapper[4842]: I0311 19:15:20.922659 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2088551-e0e8-474b-b472-63b60ca972c0-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"a2088551-e0e8-474b-b472-63b60ca972c0\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.024407 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57nwb\" (UniqueName: \"kubernetes.io/projected/a2088551-e0e8-474b-b472-63b60ca972c0-kube-api-access-57nwb\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"a2088551-e0e8-474b-b472-63b60ca972c0\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.024470 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2088551-e0e8-474b-b472-63b60ca972c0-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: 
\"a2088551-e0e8-474b-b472-63b60ca972c0\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.028828 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2088551-e0e8-474b-b472-63b60ca972c0-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"a2088551-e0e8-474b-b472-63b60ca972c0\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.040776 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57nwb\" (UniqueName: \"kubernetes.io/projected/a2088551-e0e8-474b-b472-63b60ca972c0-kube-api-access-57nwb\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"a2088551-e0e8-474b-b472-63b60ca972c0\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.101204 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.591353 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.650625 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"a2088551-e0e8-474b-b472-63b60ca972c0","Type":"ContainerStarted","Data":"4d36a51eab6e063063ac7a90cfa4349baa87a4518b4d3d52075961d71a8e6e5c"} Mar 11 19:15:21 crc kubenswrapper[4842]: I0311 19:15:21.871793 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.056447 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-scripts\") pod \"cd150eed-cae1-4a99-a90f-d533d05070bf\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.056561 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-286cf\" (UniqueName: \"kubernetes.io/projected/cd150eed-cae1-4a99-a90f-d533d05070bf-kube-api-access-286cf\") pod \"cd150eed-cae1-4a99-a90f-d533d05070bf\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.056714 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-config-data\") pod \"cd150eed-cae1-4a99-a90f-d533d05070bf\" (UID: \"cd150eed-cae1-4a99-a90f-d533d05070bf\") " Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.062304 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd150eed-cae1-4a99-a90f-d533d05070bf-kube-api-access-286cf" (OuterVolumeSpecName: "kube-api-access-286cf") pod "cd150eed-cae1-4a99-a90f-d533d05070bf" (UID: "cd150eed-cae1-4a99-a90f-d533d05070bf"). InnerVolumeSpecName "kube-api-access-286cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.062484 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-scripts" (OuterVolumeSpecName: "scripts") pod "cd150eed-cae1-4a99-a90f-d533d05070bf" (UID: "cd150eed-cae1-4a99-a90f-d533d05070bf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.084542 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-config-data" (OuterVolumeSpecName: "config-data") pod "cd150eed-cae1-4a99-a90f-d533d05070bf" (UID: "cd150eed-cae1-4a99-a90f-d533d05070bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.158178 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.158211 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd150eed-cae1-4a99-a90f-d533d05070bf-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.158222 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-286cf\" (UniqueName: \"kubernetes.io/projected/cd150eed-cae1-4a99-a90f-d533d05070bf-kube-api-access-286cf\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.660834 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" event={"ID":"cd150eed-cae1-4a99-a90f-d533d05070bf","Type":"ContainerDied","Data":"46bf3fbf4a5cd10ddbdefe60d9f2fafd75afdc203edbdc67621721e1033bd679"} Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.661310 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.673455 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46bf3fbf4a5cd10ddbdefe60d9f2fafd75afdc203edbdc67621721e1033bd679" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.673494 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"a2088551-e0e8-474b-b472-63b60ca972c0","Type":"ContainerStarted","Data":"586fd9696e7e5d4b8dba356243d03e667d6737a2bbb02c29994a7397c74731da"} Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.673523 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.692516 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=2.692490167 podStartE2EDuration="2.692490167s" podCreationTimestamp="2026-03-11 19:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:22.67701551 +0000 UTC m=+1568.324711790" watchObservedRunningTime="2026-03-11 19:15:22.692490167 +0000 UTC m=+1568.340186477" Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.927197 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.927758 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerName="nova-kuttl-api-log" containerID="cri-o://6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc" gracePeriod=30 Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.927944 4842 
kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerName="nova-kuttl-api-api" containerID="cri-o://bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd" gracePeriod=30 Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.947466 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.947756 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="f9963ae2-f79a-41fe-bc41-20222f48c6e6" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca" gracePeriod=30 Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.975131 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.975502 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-log" containerID="cri-o://238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9" gracePeriod=30 Mar 11 19:15:22 crc kubenswrapper[4842]: I0311 19:15:22.975580 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e" gracePeriod=30 Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.492971 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.519994 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8plzx\" (UniqueName: \"kubernetes.io/projected/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-kube-api-access-8plzx\") pod \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.520074 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-logs\") pod \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.520165 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-config-data\") pod \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\" (UID: \"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d\") " Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.521299 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-logs" (OuterVolumeSpecName: "logs") pod "7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" (UID: "7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.526038 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-kube-api-access-8plzx" (OuterVolumeSpecName: "kube-api-access-8plzx") pod "7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" (UID: "7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d"). InnerVolumeSpecName "kube-api-access-8plzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.547603 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-config-data" (OuterVolumeSpecName: "config-data") pod "7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" (UID: "7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.581570 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.621682 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.621718 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8plzx\" (UniqueName: \"kubernetes.io/projected/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-kube-api-access-8plzx\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.621732 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.672056 4842 generic.go:334] "Generic (PLEG): container finished" podID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerID="bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd" exitCode=0 Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.672088 4842 generic.go:334] "Generic (PLEG): container finished" podID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerID="6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc" exitCode=143 Mar 11 19:15:23 crc 
kubenswrapper[4842]: I0311 19:15:23.672140 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d","Type":"ContainerDied","Data":"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd"} Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.672190 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d","Type":"ContainerDied","Data":"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc"} Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.672206 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d","Type":"ContainerDied","Data":"6b401f76a382b783ff84b5bb7fcf3f25add3ac11ccc50104e49ce0edcc2be9f1"} Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.672226 4842 scope.go:117] "RemoveContainer" containerID="bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.672380 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.675133 4842 generic.go:334] "Generic (PLEG): container finished" podID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerID="c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e" exitCode=0 Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.675161 4842 generic.go:334] "Generic (PLEG): container finished" podID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerID="238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9" exitCode=143 Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.675461 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.675658 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1f55c740-f76d-4d07-8b16-2414d33494e7","Type":"ContainerDied","Data":"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e"} Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.675702 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1f55c740-f76d-4d07-8b16-2414d33494e7","Type":"ContainerDied","Data":"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9"} Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.675712 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1f55c740-f76d-4d07-8b16-2414d33494e7","Type":"ContainerDied","Data":"e5a2c16072629cfac8cee00e5475d7f5df27fbd64b497de58909eb3564d60dc7"} Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.711663 4842 scope.go:117] "RemoveContainer" containerID="6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.718957 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.723647 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f55c740-f76d-4d07-8b16-2414d33494e7-config-data\") pod \"1f55c740-f76d-4d07-8b16-2414d33494e7\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.723753 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f55c740-f76d-4d07-8b16-2414d33494e7-logs\") pod \"1f55c740-f76d-4d07-8b16-2414d33494e7\" (UID: 
\"1f55c740-f76d-4d07-8b16-2414d33494e7\") " Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.723864 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hdx5\" (UniqueName: \"kubernetes.io/projected/1f55c740-f76d-4d07-8b16-2414d33494e7-kube-api-access-8hdx5\") pod \"1f55c740-f76d-4d07-8b16-2414d33494e7\" (UID: \"1f55c740-f76d-4d07-8b16-2414d33494e7\") " Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.725340 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f55c740-f76d-4d07-8b16-2414d33494e7-logs" (OuterVolumeSpecName: "logs") pod "1f55c740-f76d-4d07-8b16-2414d33494e7" (UID: "1f55c740-f76d-4d07-8b16-2414d33494e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.727488 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f55c740-f76d-4d07-8b16-2414d33494e7-kube-api-access-8hdx5" (OuterVolumeSpecName: "kube-api-access-8hdx5") pod "1f55c740-f76d-4d07-8b16-2414d33494e7" (UID: "1f55c740-f76d-4d07-8b16-2414d33494e7"). InnerVolumeSpecName "kube-api-access-8hdx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.733326 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.733454 4842 scope.go:117] "RemoveContainer" containerID="bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.734676 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd\": container with ID starting with bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd not found: ID does not exist" containerID="bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.734709 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd"} err="failed to get container status \"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd\": rpc error: code = NotFound desc = could not find container \"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd\": container with ID starting with bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.734731 4842 scope.go:117] "RemoveContainer" containerID="6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.736020 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc\": container with ID starting with 6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc not found: 
ID does not exist" containerID="6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.736046 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc"} err="failed to get container status \"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc\": rpc error: code = NotFound desc = could not find container \"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc\": container with ID starting with 6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.736062 4842 scope.go:117] "RemoveContainer" containerID="bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.738407 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd"} err="failed to get container status \"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd\": rpc error: code = NotFound desc = could not find container \"bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd\": container with ID starting with bd964a5eec1e83863d001ca8b110f5f1463725053d6a09a35398c4ecd73306cd not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.738453 4842 scope.go:117] "RemoveContainer" containerID="6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.742385 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc"} err="failed to get container status \"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc\": rpc error: code = 
NotFound desc = could not find container \"6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc\": container with ID starting with 6d2df045f03147d95a659118db8cdd1d6f0af1d2387e7d70cf827aa8c3d0b1fc not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.742416 4842 scope.go:117] "RemoveContainer" containerID="c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.752420 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.752865 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-log" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.752884 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-log" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.752894 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-metadata" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.752901 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-metadata" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.752921 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerName="nova-kuttl-api-log" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.752929 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerName="nova-kuttl-api-log" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.752937 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" 
containerName="nova-kuttl-api-api" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.752943 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerName="nova-kuttl-api-api" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.752988 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd150eed-cae1-4a99-a90f-d533d05070bf" containerName="nova-manage" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.752994 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd150eed-cae1-4a99-a90f-d533d05070bf" containerName="nova-manage" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.753154 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd150eed-cae1-4a99-a90f-d533d05070bf" containerName="nova-manage" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.753170 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-metadata" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.753185 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerName="nova-kuttl-api-api" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.753194 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" containerName="nova-kuttl-api-log" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.753205 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" containerName="nova-kuttl-metadata-log" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.754200 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.756473 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f55c740-f76d-4d07-8b16-2414d33494e7-config-data" (OuterVolumeSpecName: "config-data") pod "1f55c740-f76d-4d07-8b16-2414d33494e7" (UID: "1f55c740-f76d-4d07-8b16-2414d33494e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.758732 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.760003 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.782355 4842 scope.go:117] "RemoveContainer" containerID="238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.803064 4842 scope.go:117] "RemoveContainer" containerID="c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.803830 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e\": container with ID starting with c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e not found: ID does not exist" containerID="c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.803949 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e"} err="failed to get container status \"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e\": rpc 
error: code = NotFound desc = could not find container \"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e\": container with ID starting with c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.803996 4842 scope.go:117] "RemoveContainer" containerID="238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9" Mar 11 19:15:23 crc kubenswrapper[4842]: E0311 19:15:23.804556 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9\": container with ID starting with 238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9 not found: ID does not exist" containerID="238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.804594 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9"} err="failed to get container status \"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9\": rpc error: code = NotFound desc = could not find container \"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9\": container with ID starting with 238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9 not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.804621 4842 scope.go:117] "RemoveContainer" containerID="c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.804884 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e"} err="failed to get container status 
\"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e\": rpc error: code = NotFound desc = could not find container \"c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e\": container with ID starting with c3d589236b25cd57c8ff72be827f195e6c1183bd8f989152fc9637cd460cdf2e not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.804903 4842 scope.go:117] "RemoveContainer" containerID="238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.805188 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9"} err="failed to get container status \"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9\": rpc error: code = NotFound desc = could not find container \"238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9\": container with ID starting with 238446d653ffbb6d67b37fe3c280352112b34457a5d36afb6561f6fa24e1aac9 not found: ID does not exist" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.825345 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hdx5\" (UniqueName: \"kubernetes.io/projected/1f55c740-f76d-4d07-8b16-2414d33494e7-kube-api-access-8hdx5\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.825378 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f55c740-f76d-4d07-8b16-2414d33494e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.825388 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f55c740-f76d-4d07-8b16-2414d33494e7-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.926445 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef4962d-9955-4c1f-974b-3c684f014905-config-data\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.926601 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ef4962d-9955-4c1f-974b-3c684f014905-logs\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:23 crc kubenswrapper[4842]: I0311 19:15:23.926730 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvwq\" (UniqueName: \"kubernetes.io/projected/0ef4962d-9955-4c1f-974b-3c684f014905-kube-api-access-zxvwq\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.012524 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.024147 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.028557 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef4962d-9955-4c1f-974b-3c684f014905-config-data\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.028652 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0ef4962d-9955-4c1f-974b-3c684f014905-logs\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.028699 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvwq\" (UniqueName: \"kubernetes.io/projected/0ef4962d-9955-4c1f-974b-3c684f014905-kube-api-access-zxvwq\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.029835 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ef4962d-9955-4c1f-974b-3c684f014905-logs\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.034991 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.035383 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef4962d-9955-4c1f-974b-3c684f014905-config-data\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.036997 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.039010 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.057018 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.066023 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvwq\" (UniqueName: \"kubernetes.io/projected/0ef4962d-9955-4c1f-974b-3c684f014905-kube-api-access-zxvwq\") pod \"nova-kuttl-api-0\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.080079 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.232837 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b555378-334c-49fb-ae66-aaea28f87bc3-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.232885 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dh9\" (UniqueName: \"kubernetes.io/projected/1b555378-334c-49fb-ae66-aaea28f87bc3-kube-api-access-t9dh9\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.232910 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1b555378-334c-49fb-ae66-aaea28f87bc3-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.294808 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.313201 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.335263 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b555378-334c-49fb-ae66-aaea28f87bc3-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.335418 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9dh9\" (UniqueName: \"kubernetes.io/projected/1b555378-334c-49fb-ae66-aaea28f87bc3-kube-api-access-t9dh9\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.335481 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b555378-334c-49fb-ae66-aaea28f87bc3-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.336978 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b555378-334c-49fb-ae66-aaea28f87bc3-logs\") pod \"nova-kuttl-metadata-0\" (UID: 
\"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.347198 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b555378-334c-49fb-ae66-aaea28f87bc3-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.359432 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dh9\" (UniqueName: \"kubernetes.io/projected/1b555378-334c-49fb-ae66-aaea28f87bc3-kube-api-access-t9dh9\") pod \"nova-kuttl-metadata-0\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.369017 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.486312 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.685662 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"0ef4962d-9955-4c1f-974b-3c684f014905","Type":"ContainerStarted","Data":"6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9"}
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.685718 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"0ef4962d-9955-4c1f-974b-3c684f014905","Type":"ContainerStarted","Data":"341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761"}
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.685732 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"0ef4962d-9955-4c1f-974b-3c684f014905","Type":"ContainerStarted","Data":"950fc2e3d3acd8677fbd138cdac80e565223f6f95f6fd27ff7a4bebaff4b3b39"}
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.705781 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.708032 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=1.708014818 podStartE2EDuration="1.708014818s" podCreationTimestamp="2026-03-11 19:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:24.701328102 +0000 UTC m=+1570.349024392" watchObservedRunningTime="2026-03-11 19:15:24.708014818 +0000 UTC m=+1570.355711098"
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.898138 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:15:24 crc kubenswrapper[4842]: W0311 19:15:24.899421 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b555378_334c_49fb_ae66_aaea28f87bc3.slice/crio-dd201877bed58a39c1807be1605e8fd143311a6b9023ed348b0987f92b3681b4 WatchSource:0}: Error finding container dd201877bed58a39c1807be1605e8fd143311a6b9023ed348b0987f92b3681b4: Status 404 returned error can't find the container with id dd201877bed58a39c1807be1605e8fd143311a6b9023ed348b0987f92b3681b4
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.972450 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f55c740-f76d-4d07-8b16-2414d33494e7" path="/var/lib/kubelet/pods/1f55c740-f76d-4d07-8b16-2414d33494e7/volumes"
Mar 11 19:15:24 crc kubenswrapper[4842]: I0311 19:15:24.973598 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d" path="/var/lib/kubelet/pods/7b9ffcc6-8144-4ec4-ae68-e1cc8d90592d/volumes"
Mar 11 19:15:25 crc kubenswrapper[4842]: I0311 19:15:25.717240 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1b555378-334c-49fb-ae66-aaea28f87bc3","Type":"ContainerStarted","Data":"061bf07e26c8e75c5a6477c3ac16f6ad4dd82e0726171955073f1beb93162e80"}
Mar 11 19:15:25 crc kubenswrapper[4842]: I0311 19:15:25.718158 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1b555378-334c-49fb-ae66-aaea28f87bc3","Type":"ContainerStarted","Data":"04595c4cba0cbe1716c356e578ff6099c34682b03ff823799257cc83edc57bc3"}
Mar 11 19:15:25 crc kubenswrapper[4842]: I0311 19:15:25.718179 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1b555378-334c-49fb-ae66-aaea28f87bc3","Type":"ContainerStarted","Data":"dd201877bed58a39c1807be1605e8fd143311a6b9023ed348b0987f92b3681b4"}
Mar 11 19:15:25 crc kubenswrapper[4842]: I0311 19:15:25.745516 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=1.7454826589999999 podStartE2EDuration="1.745482659s" podCreationTimestamp="2026-03-11 19:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:25.742707996 +0000 UTC m=+1571.390404296" watchObservedRunningTime="2026-03-11 19:15:25.745482659 +0000 UTC m=+1571.393179119"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.139868 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.444434 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.572934 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9963ae2-f79a-41fe-bc41-20222f48c6e6-config-data\") pod \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") "
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.573062 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lhbs\" (UniqueName: \"kubernetes.io/projected/f9963ae2-f79a-41fe-bc41-20222f48c6e6-kube-api-access-8lhbs\") pod \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\" (UID: \"f9963ae2-f79a-41fe-bc41-20222f48c6e6\") "
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.582650 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9963ae2-f79a-41fe-bc41-20222f48c6e6-kube-api-access-8lhbs" (OuterVolumeSpecName: "kube-api-access-8lhbs") pod "f9963ae2-f79a-41fe-bc41-20222f48c6e6" (UID: "f9963ae2-f79a-41fe-bc41-20222f48c6e6"). InnerVolumeSpecName "kube-api-access-8lhbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.600418 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9963ae2-f79a-41fe-bc41-20222f48c6e6-config-data" (OuterVolumeSpecName: "config-data") pod "f9963ae2-f79a-41fe-bc41-20222f48c6e6" (UID: "f9963ae2-f79a-41fe-bc41-20222f48c6e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.671202 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"]
Mar 11 19:15:26 crc kubenswrapper[4842]: E0311 19:15:26.672904 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9963ae2-f79a-41fe-bc41-20222f48c6e6" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.672940 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9963ae2-f79a-41fe-bc41-20222f48c6e6" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.673198 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9963ae2-f79a-41fe-bc41-20222f48c6e6" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.675166 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9963ae2-f79a-41fe-bc41-20222f48c6e6-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.675210 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lhbs\" (UniqueName: \"kubernetes.io/projected/f9963ae2-f79a-41fe-bc41-20222f48c6e6-kube-api-access-8lhbs\") on node \"crc\" DevicePath \"\""
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.675392 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.680332 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-scripts"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.680784 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-config-data"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.694400 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"]
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.728573 4842 generic.go:334] "Generic (PLEG): container finished" podID="f9963ae2-f79a-41fe-bc41-20222f48c6e6" containerID="273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca" exitCode=0
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.730127 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.730601 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"f9963ae2-f79a-41fe-bc41-20222f48c6e6","Type":"ContainerDied","Data":"273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca"}
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.730658 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"f9963ae2-f79a-41fe-bc41-20222f48c6e6","Type":"ContainerDied","Data":"791bb66530a4dbc8d9bcf534d4c398353ea4c832eefd1f20edf67171ac44981f"}
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.730680 4842 scope.go:117] "RemoveContainer" containerID="273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.762622 4842 scope.go:117] "RemoveContainer" containerID="273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca"
Mar 11 19:15:26 crc kubenswrapper[4842]: E0311 19:15:26.763070 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca\": container with ID starting with 273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca not found: ID does not exist" containerID="273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.763103 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca"} err="failed to get container status \"273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca\": rpc error: code = NotFound desc = could not find container \"273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca\": container with ID starting with 273c9a81463aee9fd768013f3e59c33358474ade0f2ea15b04c5bc2ad114d6ca not found: ID does not exist"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.776470 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffpkz\" (UniqueName: \"kubernetes.io/projected/b8a127e0-ea5b-46e1-92a5-d748c5415c18-kube-api-access-ffpkz\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.776551 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-config-data\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.776633 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-scripts\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.777068 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.795457 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.819530 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.821661 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.823721 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.826917 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.879173 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffpkz\" (UniqueName: \"kubernetes.io/projected/b8a127e0-ea5b-46e1-92a5-d748c5415c18-kube-api-access-ffpkz\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.879251 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-config-data\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.879374 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-scripts\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.885321 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-config-data\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.885333 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-scripts\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.895609 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffpkz\" (UniqueName: \"kubernetes.io/projected/b8a127e0-ea5b-46e1-92a5-d748c5415c18-kube-api-access-ffpkz\") pod \"nova-kuttl-cell1-cell-mapping-nt26x\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.975744 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9963ae2-f79a-41fe-bc41-20222f48c6e6" path="/var/lib/kubelet/pods/f9963ae2-f79a-41fe-bc41-20222f48c6e6/volumes"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.981445 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhhgb\" (UniqueName: \"kubernetes.io/projected/7a1b914d-162f-476c-a571-c5d52f57aad6-kube-api-access-xhhgb\") pod \"nova-kuttl-scheduler-0\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:26 crc kubenswrapper[4842]: I0311 19:15:26.981520 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1b914d-162f-476c-a571-c5d52f57aad6-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.012367 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.084378 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhhgb\" (UniqueName: \"kubernetes.io/projected/7a1b914d-162f-476c-a571-c5d52f57aad6-kube-api-access-xhhgb\") pod \"nova-kuttl-scheduler-0\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.085353 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1b914d-162f-476c-a571-c5d52f57aad6-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.094758 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1b914d-162f-476c-a571-c5d52f57aad6-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.122631 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhhgb\" (UniqueName: \"kubernetes.io/projected/7a1b914d-162f-476c-a571-c5d52f57aad6-kube-api-access-xhhgb\") pod \"nova-kuttl-scheduler-0\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.143654 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.474872 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"]
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.585802 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.590355 4842 scope.go:117] "RemoveContainer" containerID="9ad2378cb4700afaef5d3e57ccbe6c1db255c389eb8009a2b027691e761c4e36"
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.773711 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7a1b914d-162f-476c-a571-c5d52f57aad6","Type":"ContainerStarted","Data":"14da888a5d2d2281786c700ec6b9da2b8c334fb2796ccee6b501949df860a649"}
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.775462 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x" event={"ID":"b8a127e0-ea5b-46e1-92a5-d748c5415c18","Type":"ContainerStarted","Data":"a98de39e24d997969f3f4c625aa016ed40a06b93652aa6a2a560eef9fb2ad795"}
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.775520 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x" event={"ID":"b8a127e0-ea5b-46e1-92a5-d748c5415c18","Type":"ContainerStarted","Data":"707397711a353320f2a505eb4400136fe5fcd340a8fd197be0930af7cf36a312"}
Mar 11 19:15:27 crc kubenswrapper[4842]: I0311 19:15:27.797672 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x" podStartSLOduration=1.7976528950000001 podStartE2EDuration="1.797652895s" podCreationTimestamp="2026-03-11 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:27.795740754 +0000 UTC m=+1573.443437034" watchObservedRunningTime="2026-03-11 19:15:27.797652895 +0000 UTC m=+1573.445349185"
Mar 11 19:15:28 crc kubenswrapper[4842]: I0311 19:15:28.789602 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7a1b914d-162f-476c-a571-c5d52f57aad6","Type":"ContainerStarted","Data":"b1a0f1aa477c65b2fb09e651f9b4cbfca0baa0134da2d2d620acab155004d0a1"}
Mar 11 19:15:28 crc kubenswrapper[4842]: I0311 19:15:28.809988 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.809964915 podStartE2EDuration="2.809964915s" podCreationTimestamp="2026-03-11 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:28.808144807 +0000 UTC m=+1574.455841107" watchObservedRunningTime="2026-03-11 19:15:28.809964915 +0000 UTC m=+1574.457661215"
Mar 11 19:15:28 crc kubenswrapper[4842]: I0311 19:15:28.968986 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:15:28 crc kubenswrapper[4842]: E0311 19:15:28.969224 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:15:32 crc kubenswrapper[4842]: I0311 19:15:32.144249 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:32 crc kubenswrapper[4842]: I0311 19:15:32.822179 4842 generic.go:334] "Generic (PLEG): container finished" podID="b8a127e0-ea5b-46e1-92a5-d748c5415c18" containerID="a98de39e24d997969f3f4c625aa016ed40a06b93652aa6a2a560eef9fb2ad795" exitCode=0
Mar 11 19:15:32 crc kubenswrapper[4842]: I0311 19:15:32.822230 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x" event={"ID":"b8a127e0-ea5b-46e1-92a5-d748c5415c18","Type":"ContainerDied","Data":"a98de39e24d997969f3f4c625aa016ed40a06b93652aa6a2a560eef9fb2ad795"}
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.081458 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.082202 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.132160 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.307765 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-scripts\") pod \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") "
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.308300 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffpkz\" (UniqueName: \"kubernetes.io/projected/b8a127e0-ea5b-46e1-92a5-d748c5415c18-kube-api-access-ffpkz\") pod \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") "
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.308349 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-config-data\") pod \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\" (UID: \"b8a127e0-ea5b-46e1-92a5-d748c5415c18\") "
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.313478 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-scripts" (OuterVolumeSpecName: "scripts") pod "b8a127e0-ea5b-46e1-92a5-d748c5415c18" (UID: "b8a127e0-ea5b-46e1-92a5-d748c5415c18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.321494 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a127e0-ea5b-46e1-92a5-d748c5415c18-kube-api-access-ffpkz" (OuterVolumeSpecName: "kube-api-access-ffpkz") pod "b8a127e0-ea5b-46e1-92a5-d748c5415c18" (UID: "b8a127e0-ea5b-46e1-92a5-d748c5415c18"). InnerVolumeSpecName "kube-api-access-ffpkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.332386 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-config-data" (OuterVolumeSpecName: "config-data") pod "b8a127e0-ea5b-46e1-92a5-d748c5415c18" (UID: "b8a127e0-ea5b-46e1-92a5-d748c5415c18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.410738 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.410791 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffpkz\" (UniqueName: \"kubernetes.io/projected/b8a127e0-ea5b-46e1-92a5-d748c5415c18-kube-api-access-ffpkz\") on node \"crc\" DevicePath \"\""
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.410810 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a127e0-ea5b-46e1-92a5-d748c5415c18-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.487442 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.487497 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.845449 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.845519 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x" event={"ID":"b8a127e0-ea5b-46e1-92a5-d748c5415c18","Type":"ContainerDied","Data":"707397711a353320f2a505eb4400136fe5fcd340a8fd197be0930af7cf36a312"}
Mar 11 19:15:34 crc kubenswrapper[4842]: I0311 19:15:34.845564 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="707397711a353320f2a505eb4400136fe5fcd340a8fd197be0930af7cf36a312"
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.023512 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.038578 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.038882 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="7a1b914d-162f-476c-a571-c5d52f57aad6" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://b1a0f1aa477c65b2fb09e651f9b4cbfca0baa0134da2d2d620acab155004d0a1" gracePeriod=30
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.083667 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.083954 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-log" containerID="cri-o://04595c4cba0cbe1716c356e578ff6099c34682b03ff823799257cc83edc57bc3" gracePeriod=30
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.084039 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://061bf07e26c8e75c5a6477c3ac16f6ad4dd82e0726171955073f1beb93162e80" gracePeriod=30
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.165444 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.173:8775/\": EOF"
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.165498 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.173:8775/\": EOF"
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.165458 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.172:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.165620 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.172:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.861947 4842 generic.go:334] "Generic (PLEG): container finished" podID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerID="04595c4cba0cbe1716c356e578ff6099c34682b03ff823799257cc83edc57bc3" exitCode=143
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.862125 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1b555378-334c-49fb-ae66-aaea28f87bc3","Type":"ContainerDied","Data":"04595c4cba0cbe1716c356e578ff6099c34682b03ff823799257cc83edc57bc3"}
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.862398 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-log" containerID="cri-o://341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761" gracePeriod=30
Mar 11 19:15:35 crc kubenswrapper[4842]: I0311 19:15:35.862762 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-api" containerID="cri-o://6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9" gracePeriod=30
Mar 11 19:15:36 crc kubenswrapper[4842]: I0311 19:15:36.896820 4842 generic.go:334] "Generic (PLEG): container finished" podID="0ef4962d-9955-4c1f-974b-3c684f014905" containerID="341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761" exitCode=143
Mar 11 19:15:36 crc kubenswrapper[4842]: I0311 19:15:36.896983 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"0ef4962d-9955-4c1f-974b-3c684f014905","Type":"ContainerDied","Data":"341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761"}
Mar 11 19:15:38 crc kubenswrapper[4842]: I0311 19:15:38.923674 4842 generic.go:334] "Generic (PLEG): container finished" podID="7a1b914d-162f-476c-a571-c5d52f57aad6" containerID="b1a0f1aa477c65b2fb09e651f9b4cbfca0baa0134da2d2d620acab155004d0a1" exitCode=0
Mar 11 19:15:38 crc kubenswrapper[4842]: I0311 19:15:38.924210 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7a1b914d-162f-476c-a571-c5d52f57aad6","Type":"ContainerDied","Data":"b1a0f1aa477c65b2fb09e651f9b4cbfca0baa0134da2d2d620acab155004d0a1"}
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.128438 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.209232 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhhgb\" (UniqueName: \"kubernetes.io/projected/7a1b914d-162f-476c-a571-c5d52f57aad6-kube-api-access-xhhgb\") pod \"7a1b914d-162f-476c-a571-c5d52f57aad6\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") "
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.209387 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1b914d-162f-476c-a571-c5d52f57aad6-config-data\") pod \"7a1b914d-162f-476c-a571-c5d52f57aad6\" (UID: \"7a1b914d-162f-476c-a571-c5d52f57aad6\") "
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.215580 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1b914d-162f-476c-a571-c5d52f57aad6-kube-api-access-xhhgb" (OuterVolumeSpecName: "kube-api-access-xhhgb") pod "7a1b914d-162f-476c-a571-c5d52f57aad6" (UID: "7a1b914d-162f-476c-a571-c5d52f57aad6"). InnerVolumeSpecName "kube-api-access-xhhgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.251038 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1b914d-162f-476c-a571-c5d52f57aad6-config-data" (OuterVolumeSpecName: "config-data") pod "7a1b914d-162f-476c-a571-c5d52f57aad6" (UID: "7a1b914d-162f-476c-a571-c5d52f57aad6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.313265 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhhgb\" (UniqueName: \"kubernetes.io/projected/7a1b914d-162f-476c-a571-c5d52f57aad6-kube-api-access-xhhgb\") on node \"crc\" DevicePath \"\""
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.313393 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1b914d-162f-476c-a571-c5d52f57aad6-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.935478 4842 generic.go:334] "Generic (PLEG): container finished" podID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerID="061bf07e26c8e75c5a6477c3ac16f6ad4dd82e0726171955073f1beb93162e80" exitCode=0
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.935524 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1b555378-334c-49fb-ae66-aaea28f87bc3","Type":"ContainerDied","Data":"061bf07e26c8e75c5a6477c3ac16f6ad4dd82e0726171955073f1beb93162e80"}
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.939011 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7a1b914d-162f-476c-a571-c5d52f57aad6","Type":"ContainerDied","Data":"14da888a5d2d2281786c700ec6b9da2b8c334fb2796ccee6b501949df860a649"}
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.939059 4842 scope.go:117] "RemoveContainer" containerID="b1a0f1aa477c65b2fb09e651f9b4cbfca0baa0134da2d2d620acab155004d0a1"
Mar 11 19:15:39 crc kubenswrapper[4842]: I0311 19:15:39.939129 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.082434 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.093119 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.123417 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.138431 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:15:40 crc kubenswrapper[4842]: E0311 19:15:40.139221 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-log"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139259 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-log"
Mar 11 19:15:40 crc kubenswrapper[4842]: E0311 19:15:40.139321 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139337 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:15:40 crc kubenswrapper[4842]: E0311 19:15:40.139382 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1b914d-162f-476c-a571-c5d52f57aad6" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139398 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1b914d-162f-476c-a571-c5d52f57aad6" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:15:40 crc kubenswrapper[4842]: E0311 19:15:40.139431 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a127e0-ea5b-46e1-92a5-d748c5415c18" containerName="nova-manage"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139447 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a127e0-ea5b-46e1-92a5-d748c5415c18" containerName="nova-manage"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139817 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139852 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a127e0-ea5b-46e1-92a5-d748c5415c18" containerName="nova-manage"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139877 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1b914d-162f-476c-a571-c5d52f57aad6" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.139906 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" containerName="nova-kuttl-metadata-log"
Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.141175 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.152456 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.160873 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.249752 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b555378-334c-49fb-ae66-aaea28f87bc3-logs\") pod \"1b555378-334c-49fb-ae66-aaea28f87bc3\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.249948 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dh9\" (UniqueName: \"kubernetes.io/projected/1b555378-334c-49fb-ae66-aaea28f87bc3-kube-api-access-t9dh9\") pod \"1b555378-334c-49fb-ae66-aaea28f87bc3\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.250049 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b555378-334c-49fb-ae66-aaea28f87bc3-config-data\") pod \"1b555378-334c-49fb-ae66-aaea28f87bc3\" (UID: \"1b555378-334c-49fb-ae66-aaea28f87bc3\") " Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.250527 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b555378-334c-49fb-ae66-aaea28f87bc3-logs" (OuterVolumeSpecName: "logs") pod "1b555378-334c-49fb-ae66-aaea28f87bc3" (UID: "1b555378-334c-49fb-ae66-aaea28f87bc3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.251300 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7428f2c1-f985-4599-95dc-64e530030eb0-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.251526 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhpts\" (UniqueName: \"kubernetes.io/projected/7428f2c1-f985-4599-95dc-64e530030eb0-kube-api-access-qhpts\") pod \"nova-kuttl-scheduler-0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.251910 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b555378-334c-49fb-ae66-aaea28f87bc3-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.255628 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b555378-334c-49fb-ae66-aaea28f87bc3-kube-api-access-t9dh9" (OuterVolumeSpecName: "kube-api-access-t9dh9") pod "1b555378-334c-49fb-ae66-aaea28f87bc3" (UID: "1b555378-334c-49fb-ae66-aaea28f87bc3"). InnerVolumeSpecName "kube-api-access-t9dh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.281816 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b555378-334c-49fb-ae66-aaea28f87bc3-config-data" (OuterVolumeSpecName: "config-data") pod "1b555378-334c-49fb-ae66-aaea28f87bc3" (UID: "1b555378-334c-49fb-ae66-aaea28f87bc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.353231 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7428f2c1-f985-4599-95dc-64e530030eb0-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.353325 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhpts\" (UniqueName: \"kubernetes.io/projected/7428f2c1-f985-4599-95dc-64e530030eb0-kube-api-access-qhpts\") pod \"nova-kuttl-scheduler-0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.353445 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9dh9\" (UniqueName: \"kubernetes.io/projected/1b555378-334c-49fb-ae66-aaea28f87bc3-kube-api-access-t9dh9\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.353456 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b555378-334c-49fb-ae66-aaea28f87bc3-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.358257 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7428f2c1-f985-4599-95dc-64e530030eb0-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.375721 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhpts\" (UniqueName: 
\"kubernetes.io/projected/7428f2c1-f985-4599-95dc-64e530030eb0-kube-api-access-qhpts\") pod \"nova-kuttl-scheduler-0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.475540 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.747489 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.861904 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef4962d-9955-4c1f-974b-3c684f014905-config-data\") pod \"0ef4962d-9955-4c1f-974b-3c684f014905\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.861989 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ef4962d-9955-4c1f-974b-3c684f014905-logs\") pod \"0ef4962d-9955-4c1f-974b-3c684f014905\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.862082 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxvwq\" (UniqueName: \"kubernetes.io/projected/0ef4962d-9955-4c1f-974b-3c684f014905-kube-api-access-zxvwq\") pod \"0ef4962d-9955-4c1f-974b-3c684f014905\" (UID: \"0ef4962d-9955-4c1f-974b-3c684f014905\") " Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.862699 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ef4962d-9955-4c1f-974b-3c684f014905-logs" (OuterVolumeSpecName: "logs") pod "0ef4962d-9955-4c1f-974b-3c684f014905" (UID: "0ef4962d-9955-4c1f-974b-3c684f014905"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.866643 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef4962d-9955-4c1f-974b-3c684f014905-kube-api-access-zxvwq" (OuterVolumeSpecName: "kube-api-access-zxvwq") pod "0ef4962d-9955-4c1f-974b-3c684f014905" (UID: "0ef4962d-9955-4c1f-974b-3c684f014905"). InnerVolumeSpecName "kube-api-access-zxvwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.883364 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef4962d-9955-4c1f-974b-3c684f014905-config-data" (OuterVolumeSpecName: "config-data") pod "0ef4962d-9955-4c1f-974b-3c684f014905" (UID: "0ef4962d-9955-4c1f-974b-3c684f014905"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.947809 4842 generic.go:334] "Generic (PLEG): container finished" podID="0ef4962d-9955-4c1f-974b-3c684f014905" containerID="6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9" exitCode=0 Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.947895 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.947914 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"0ef4962d-9955-4c1f-974b-3c684f014905","Type":"ContainerDied","Data":"6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9"} Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.947943 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"0ef4962d-9955-4c1f-974b-3c684f014905","Type":"ContainerDied","Data":"950fc2e3d3acd8677fbd138cdac80e565223f6f95f6fd27ff7a4bebaff4b3b39"} Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.947959 4842 scope.go:117] "RemoveContainer" containerID="6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.951899 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"1b555378-334c-49fb-ae66-aaea28f87bc3","Type":"ContainerDied","Data":"dd201877bed58a39c1807be1605e8fd143311a6b9023ed348b0987f92b3681b4"} Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.952059 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.964316 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ef4962d-9955-4c1f-974b-3c684f014905-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.964748 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxvwq\" (UniqueName: \"kubernetes.io/projected/0ef4962d-9955-4c1f-974b-3c684f014905-kube-api-access-zxvwq\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.964763 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef4962d-9955-4c1f-974b-3c684f014905-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.976102 4842 scope.go:117] "RemoveContainer" containerID="341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761" Mar 11 19:15:40 crc kubenswrapper[4842]: I0311 19:15:40.979738 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1b914d-162f-476c-a571-c5d52f57aad6" path="/var/lib/kubelet/pods/7a1b914d-162f-476c-a571-c5d52f57aad6/volumes" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.001073 4842 scope.go:117] "RemoveContainer" containerID="6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9" Mar 11 19:15:41 crc kubenswrapper[4842]: E0311 19:15:41.002506 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9\": container with ID starting with 6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9 not found: ID does not exist" containerID="6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.002553 
4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9"} err="failed to get container status \"6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9\": rpc error: code = NotFound desc = could not find container \"6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9\": container with ID starting with 6afdbce575bc292c12672aa87aa67a3b046934f6a2fe4121c320915a0cc3d6b9 not found: ID does not exist" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.002661 4842 scope.go:117] "RemoveContainer" containerID="341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761" Mar 11 19:15:41 crc kubenswrapper[4842]: E0311 19:15:41.011049 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761\": container with ID starting with 341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761 not found: ID does not exist" containerID="341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.011151 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761"} err="failed to get container status \"341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761\": rpc error: code = NotFound desc = could not find container \"341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761\": container with ID starting with 341f566486389018f898ac966f5a8472c088f93bdf46600ba393ea30aa5b7761 not found: ID does not exist" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.011188 4842 scope.go:117] "RemoveContainer" containerID="061bf07e26c8e75c5a6477c3ac16f6ad4dd82e0726171955073f1beb93162e80" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 
19:15:41.017854 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.040424 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.063535 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.076211 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.083426 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.089924 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: E0311 19:15:41.090639 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-log" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.090662 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-log" Mar 11 19:15:41 crc kubenswrapper[4842]: E0311 19:15:41.090677 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-api" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.090685 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-api" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.090832 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-log" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 
19:15:41.090845 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" containerName="nova-kuttl-api-api" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.091717 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.094167 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.097801 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.114338 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.115737 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.117795 4842 scope.go:117] "RemoveContainer" containerID="04595c4cba0cbe1716c356e578ff6099c34682b03ff823799257cc83edc57bc3" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.118045 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.118511 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.270141 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znvxz\" (UniqueName: \"kubernetes.io/projected/5e69f281-2894-4ac6-b64b-d83754a1f246-kube-api-access-znvxz\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc 
kubenswrapper[4842]: I0311 19:15:41.270189 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43541093-2b42-4aa7-8371-c1255581d7ee-logs\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.270216 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69f281-2894-4ac6-b64b-d83754a1f246-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.270244 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx9cm\" (UniqueName: \"kubernetes.io/projected/43541093-2b42-4aa7-8371-c1255581d7ee-kube-api-access-kx9cm\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.270327 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e69f281-2894-4ac6-b64b-d83754a1f246-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.270356 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43541093-2b42-4aa7-8371-c1255581d7ee-config-data\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.371443 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znvxz\" (UniqueName: \"kubernetes.io/projected/5e69f281-2894-4ac6-b64b-d83754a1f246-kube-api-access-znvxz\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.371685 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43541093-2b42-4aa7-8371-c1255581d7ee-logs\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.371715 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69f281-2894-4ac6-b64b-d83754a1f246-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.371750 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx9cm\" (UniqueName: \"kubernetes.io/projected/43541093-2b42-4aa7-8371-c1255581d7ee-kube-api-access-kx9cm\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.371774 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e69f281-2894-4ac6-b64b-d83754a1f246-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.371797 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/43541093-2b42-4aa7-8371-c1255581d7ee-config-data\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.372856 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43541093-2b42-4aa7-8371-c1255581d7ee-logs\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.373207 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e69f281-2894-4ac6-b64b-d83754a1f246-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.377222 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43541093-2b42-4aa7-8371-c1255581d7ee-config-data\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.377656 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69f281-2894-4ac6-b64b-d83754a1f246-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.396640 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx9cm\" (UniqueName: \"kubernetes.io/projected/43541093-2b42-4aa7-8371-c1255581d7ee-kube-api-access-kx9cm\") pod \"nova-kuttl-api-0\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") " pod="nova-kuttl-default/nova-kuttl-api-0" 
Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.396784 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znvxz\" (UniqueName: \"kubernetes.io/projected/5e69f281-2894-4ac6-b64b-d83754a1f246-kube-api-access-znvxz\") pod \"nova-kuttl-metadata-0\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.437453 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.452359 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.671472 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: W0311 19:15:41.671895 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43541093_2b42_4aa7_8371_c1255581d7ee.slice/crio-b7d64249eed4a91ae5ca8037f86c376d9846218ae3971c3c23ea12a9e101acd5 WatchSource:0}: Error finding container b7d64249eed4a91ae5ca8037f86c376d9846218ae3971c3c23ea12a9e101acd5: Status 404 returned error can't find the container with id b7d64249eed4a91ae5ca8037f86c376d9846218ae3971c3c23ea12a9e101acd5 Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.733331 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:15:41 crc kubenswrapper[4842]: W0311 19:15:41.738980 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e69f281_2894_4ac6_b64b_d83754a1f246.slice/crio-84e071513e88cfaeb8274181f66502557f35b645b0caede4675e183dc423ae82 WatchSource:0}: Error finding container 
84e071513e88cfaeb8274181f66502557f35b645b0caede4675e183dc423ae82: Status 404 returned error can't find the container with id 84e071513e88cfaeb8274181f66502557f35b645b0caede4675e183dc423ae82 Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.963679 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:15:41 crc kubenswrapper[4842]: E0311 19:15:41.964261 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.977033 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7428f2c1-f985-4599-95dc-64e530030eb0","Type":"ContainerStarted","Data":"b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509"} Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.977068 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7428f2c1-f985-4599-95dc-64e530030eb0","Type":"ContainerStarted","Data":"2279ee4aef429b8e727a9115c793a560a1bd7a5aa9734c56aaf1750fa450f0a6"} Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.982180 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"43541093-2b42-4aa7-8371-c1255581d7ee","Type":"ContainerStarted","Data":"1eb1bb479f5c0abf3a7a874550e727c46e159730520693b2909310cc1ea503e4"} Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.982216 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" 
event={"ID":"43541093-2b42-4aa7-8371-c1255581d7ee","Type":"ContainerStarted","Data":"b7d64249eed4a91ae5ca8037f86c376d9846218ae3971c3c23ea12a9e101acd5"} Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.984720 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5e69f281-2894-4ac6-b64b-d83754a1f246","Type":"ContainerStarted","Data":"4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588"} Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.984744 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5e69f281-2894-4ac6-b64b-d83754a1f246","Type":"ContainerStarted","Data":"84e071513e88cfaeb8274181f66502557f35b645b0caede4675e183dc423ae82"} Mar 11 19:15:41 crc kubenswrapper[4842]: I0311 19:15:41.993506 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=1.993494622 podStartE2EDuration="1.993494622s" podCreationTimestamp="2026-03-11 19:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:41.991864539 +0000 UTC m=+1587.639560829" watchObservedRunningTime="2026-03-11 19:15:41.993494622 +0000 UTC m=+1587.641190902" Mar 11 19:15:42 crc kubenswrapper[4842]: I0311 19:15:42.972446 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ef4962d-9955-4c1f-974b-3c684f014905" path="/var/lib/kubelet/pods/0ef4962d-9955-4c1f-974b-3c684f014905/volumes" Mar 11 19:15:42 crc kubenswrapper[4842]: I0311 19:15:42.973352 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b555378-334c-49fb-ae66-aaea28f87bc3" path="/var/lib/kubelet/pods/1b555378-334c-49fb-ae66-aaea28f87bc3/volumes" Mar 11 19:15:42 crc kubenswrapper[4842]: I0311 19:15:42.998507 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"43541093-2b42-4aa7-8371-c1255581d7ee","Type":"ContainerStarted","Data":"1374e0770149dc741b6795bacd9dd8ea59ba4a3e3a5f5f0bf2b4359e52207329"} Mar 11 19:15:43 crc kubenswrapper[4842]: I0311 19:15:43.001594 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5e69f281-2894-4ac6-b64b-d83754a1f246","Type":"ContainerStarted","Data":"4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de"} Mar 11 19:15:43 crc kubenswrapper[4842]: I0311 19:15:43.020884 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.020849918 podStartE2EDuration="2.020849918s" podCreationTimestamp="2026-03-11 19:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:43.018462325 +0000 UTC m=+1588.666158635" watchObservedRunningTime="2026-03-11 19:15:43.020849918 +0000 UTC m=+1588.668546208" Mar 11 19:15:43 crc kubenswrapper[4842]: I0311 19:15:43.050742 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=3.050722314 podStartE2EDuration="3.050722314s" podCreationTimestamp="2026-03-11 19:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:15:43.041734028 +0000 UTC m=+1588.689430318" watchObservedRunningTime="2026-03-11 19:15:43.050722314 +0000 UTC m=+1588.698418594" Mar 11 19:15:44 crc kubenswrapper[4842]: I0311 19:15:44.927012 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b6tqt"] Mar 11 19:15:44 crc kubenswrapper[4842]: I0311 19:15:44.929417 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:44 crc kubenswrapper[4842]: I0311 19:15:44.953572 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6tqt"] Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.043542 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fbd\" (UniqueName: \"kubernetes.io/projected/593dd97c-23ef-4142-acea-ce5bb8589b2d-kube-api-access-q8fbd\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.045705 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-catalog-content\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.045836 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-utilities\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.147063 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-utilities\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.147125 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q8fbd\" (UniqueName: \"kubernetes.io/projected/593dd97c-23ef-4142-acea-ce5bb8589b2d-kube-api-access-q8fbd\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.147248 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-catalog-content\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.147737 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-catalog-content\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.147874 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-utilities\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.174494 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fbd\" (UniqueName: \"kubernetes.io/projected/593dd97c-23ef-4142-acea-ce5bb8589b2d-kube-api-access-q8fbd\") pod \"community-operators-b6tqt\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.314629 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.476742 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:45 crc kubenswrapper[4842]: I0311 19:15:45.809739 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6tqt"] Mar 11 19:15:46 crc kubenswrapper[4842]: I0311 19:15:46.047603 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerStarted","Data":"e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea"} Mar 11 19:15:46 crc kubenswrapper[4842]: I0311 19:15:46.047663 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerStarted","Data":"86f26e15a61e2abb2244e9461b68f3a8827ac0e9c25a2fc9a46b2101e3776c9c"} Mar 11 19:15:47 crc kubenswrapper[4842]: I0311 19:15:47.060920 4842 generic.go:334] "Generic (PLEG): container finished" podID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerID="e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea" exitCode=0 Mar 11 19:15:47 crc kubenswrapper[4842]: I0311 19:15:47.061193 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerDied","Data":"e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea"} Mar 11 19:15:48 crc kubenswrapper[4842]: I0311 19:15:48.077083 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerStarted","Data":"f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5"} Mar 11 19:15:49 
crc kubenswrapper[4842]: I0311 19:15:49.090240 4842 generic.go:334] "Generic (PLEG): container finished" podID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerID="f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5" exitCode=0 Mar 11 19:15:49 crc kubenswrapper[4842]: I0311 19:15:49.090313 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerDied","Data":"f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5"} Mar 11 19:15:50 crc kubenswrapper[4842]: I0311 19:15:50.114439 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerStarted","Data":"450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e"} Mar 11 19:15:50 crc kubenswrapper[4842]: I0311 19:15:50.152504 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b6tqt" podStartSLOduration=3.707817975 podStartE2EDuration="6.152477254s" podCreationTimestamp="2026-03-11 19:15:44 +0000 UTC" firstStartedPulling="2026-03-11 19:15:47.063516776 +0000 UTC m=+1592.711213056" lastFinishedPulling="2026-03-11 19:15:49.508176045 +0000 UTC m=+1595.155872335" observedRunningTime="2026-03-11 19:15:50.137031378 +0000 UTC m=+1595.784727678" watchObservedRunningTime="2026-03-11 19:15:50.152477254 +0000 UTC m=+1595.800173544" Mar 11 19:15:50 crc kubenswrapper[4842]: I0311 19:15:50.476080 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:50 crc kubenswrapper[4842]: I0311 19:15:50.509444 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:51 crc kubenswrapper[4842]: I0311 19:15:51.155048 4842 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:15:51 crc kubenswrapper[4842]: I0311 19:15:51.437707 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:51 crc kubenswrapper[4842]: I0311 19:15:51.437765 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:51 crc kubenswrapper[4842]: I0311 19:15:51.453490 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:51 crc kubenswrapper[4842]: I0311 19:15:51.453567 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:52 crc kubenswrapper[4842]: I0311 19:15:52.479657 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.177:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:15:52 crc kubenswrapper[4842]: I0311 19:15:52.602525 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.177:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:15:52 crc kubenswrapper[4842]: I0311 19:15:52.602801 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 
19:15:52 crc kubenswrapper[4842]: I0311 19:15:52.602835 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:15:54 crc kubenswrapper[4842]: I0311 19:15:54.972925 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:15:54 crc kubenswrapper[4842]: E0311 19:15:54.973771 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:15:55 crc kubenswrapper[4842]: I0311 19:15:55.315688 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:55 crc kubenswrapper[4842]: I0311 19:15:55.316088 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:55 crc kubenswrapper[4842]: I0311 19:15:55.386835 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:56 crc kubenswrapper[4842]: I0311 19:15:56.234402 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:56 crc kubenswrapper[4842]: I0311 19:15:56.298974 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b6tqt"] Mar 11 19:15:58 
crc kubenswrapper[4842]: I0311 19:15:58.194922 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b6tqt" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="registry-server" containerID="cri-o://450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e" gracePeriod=2 Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.661046 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.702085 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-utilities\") pod \"593dd97c-23ef-4142-acea-ce5bb8589b2d\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.702331 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-catalog-content\") pod \"593dd97c-23ef-4142-acea-ce5bb8589b2d\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.702371 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fbd\" (UniqueName: \"kubernetes.io/projected/593dd97c-23ef-4142-acea-ce5bb8589b2d-kube-api-access-q8fbd\") pod \"593dd97c-23ef-4142-acea-ce5bb8589b2d\" (UID: \"593dd97c-23ef-4142-acea-ce5bb8589b2d\") " Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.703655 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-utilities" (OuterVolumeSpecName: "utilities") pod "593dd97c-23ef-4142-acea-ce5bb8589b2d" (UID: "593dd97c-23ef-4142-acea-ce5bb8589b2d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.709166 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593dd97c-23ef-4142-acea-ce5bb8589b2d-kube-api-access-q8fbd" (OuterVolumeSpecName: "kube-api-access-q8fbd") pod "593dd97c-23ef-4142-acea-ce5bb8589b2d" (UID: "593dd97c-23ef-4142-acea-ce5bb8589b2d"). InnerVolumeSpecName "kube-api-access-q8fbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.804600 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8fbd\" (UniqueName: \"kubernetes.io/projected/593dd97c-23ef-4142-acea-ce5bb8589b2d-kube-api-access-q8fbd\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:58 crc kubenswrapper[4842]: I0311 19:15:58.804633 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.210980 4842 generic.go:334] "Generic (PLEG): container finished" podID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerID="450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e" exitCode=0 Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.211024 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerDied","Data":"450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e"} Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.211052 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6tqt" event={"ID":"593dd97c-23ef-4142-acea-ce5bb8589b2d","Type":"ContainerDied","Data":"86f26e15a61e2abb2244e9461b68f3a8827ac0e9c25a2fc9a46b2101e3776c9c"} Mar 11 19:15:59 crc kubenswrapper[4842]: 
I0311 19:15:59.211055 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6tqt" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.211070 4842 scope.go:117] "RemoveContainer" containerID="450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.237220 4842 scope.go:117] "RemoveContainer" containerID="f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.275472 4842 scope.go:117] "RemoveContainer" containerID="e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.300377 4842 scope.go:117] "RemoveContainer" containerID="450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e" Mar 11 19:15:59 crc kubenswrapper[4842]: E0311 19:15:59.300785 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e\": container with ID starting with 450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e not found: ID does not exist" containerID="450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.300821 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e"} err="failed to get container status \"450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e\": rpc error: code = NotFound desc = could not find container \"450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e\": container with ID starting with 450d03d57fc5e423ece669ee74d67593f5d99c6f9a21d692240c9eabcf3c457e not found: ID does not exist" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.300848 4842 
scope.go:117] "RemoveContainer" containerID="f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5" Mar 11 19:15:59 crc kubenswrapper[4842]: E0311 19:15:59.301458 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5\": container with ID starting with f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5 not found: ID does not exist" containerID="f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.301484 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5"} err="failed to get container status \"f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5\": rpc error: code = NotFound desc = could not find container \"f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5\": container with ID starting with f39acd8393bb653505626f55c388acc5d6250a0f1ab49ae2b323f1570d74f7e5 not found: ID does not exist" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.301507 4842 scope.go:117] "RemoveContainer" containerID="e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea" Mar 11 19:15:59 crc kubenswrapper[4842]: E0311 19:15:59.301768 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea\": container with ID starting with e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea not found: ID does not exist" containerID="e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.302239 4842 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea"} err="failed to get container status \"e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea\": rpc error: code = NotFound desc = could not find container \"e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea\": container with ID starting with e21c078b26356c7b8a1cc26f1f8e7da834fe0ae75915e9e6a785723670eb02ea not found: ID does not exist" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.369327 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "593dd97c-23ef-4142-acea-ce5bb8589b2d" (UID: "593dd97c-23ef-4142-acea-ce5bb8589b2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.418682 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/593dd97c-23ef-4142-acea-ce5bb8589b2d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.437798 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.437849 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.453292 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.453333 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.546612 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-b6tqt"] Mar 11 19:15:59 crc kubenswrapper[4842]: I0311 19:15:59.554135 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b6tqt"] Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.159590 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554276-9xs5k"] Mar 11 19:16:00 crc kubenswrapper[4842]: E0311 19:16:00.160220 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="extract-utilities" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.160370 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="extract-utilities" Mar 11 19:16:00 crc kubenswrapper[4842]: E0311 19:16:00.160455 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="extract-content" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.160516 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="extract-content" Mar 11 19:16:00 crc kubenswrapper[4842]: E0311 19:16:00.160580 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="registry-server" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.160637 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="registry-server" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.160842 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" containerName="registry-server" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.161460 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554276-9xs5k" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.163950 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.164600 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.167174 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.171590 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554276-9xs5k"] Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.232772 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmrz\" (UniqueName: \"kubernetes.io/projected/99cb0c62-5098-459d-b8e2-290d05c30b60-kube-api-access-4hmrz\") pod \"auto-csr-approver-29554276-9xs5k\" (UID: \"99cb0c62-5098-459d-b8e2-290d05c30b60\") " pod="openshift-infra/auto-csr-approver-29554276-9xs5k" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.334798 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmrz\" (UniqueName: \"kubernetes.io/projected/99cb0c62-5098-459d-b8e2-290d05c30b60-kube-api-access-4hmrz\") pod \"auto-csr-approver-29554276-9xs5k\" (UID: \"99cb0c62-5098-459d-b8e2-290d05c30b60\") " pod="openshift-infra/auto-csr-approver-29554276-9xs5k" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.367405 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmrz\" (UniqueName: \"kubernetes.io/projected/99cb0c62-5098-459d-b8e2-290d05c30b60-kube-api-access-4hmrz\") pod \"auto-csr-approver-29554276-9xs5k\" (UID: \"99cb0c62-5098-459d-b8e2-290d05c30b60\") " 
pod="openshift-infra/auto-csr-approver-29554276-9xs5k" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.487534 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554276-9xs5k" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.975891 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593dd97c-23ef-4142-acea-ce5bb8589b2d" path="/var/lib/kubelet/pods/593dd97c-23ef-4142-acea-ce5bb8589b2d/volumes" Mar 11 19:16:00 crc kubenswrapper[4842]: I0311 19:16:00.977376 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554276-9xs5k"] Mar 11 19:16:00 crc kubenswrapper[4842]: W0311 19:16:00.977623 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99cb0c62_5098_459d_b8e2_290d05c30b60.slice/crio-8c919898318e9e2c137c7e8e7b1ee40e5c7254e9e2ebbb78404d1385aeecb135 WatchSource:0}: Error finding container 8c919898318e9e2c137c7e8e7b1ee40e5c7254e9e2ebbb78404d1385aeecb135: Status 404 returned error can't find the container with id 8c919898318e9e2c137c7e8e7b1ee40e5c7254e9e2ebbb78404d1385aeecb135 Mar 11 19:16:01 crc kubenswrapper[4842]: I0311 19:16:01.229987 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554276-9xs5k" event={"ID":"99cb0c62-5098-459d-b8e2-290d05c30b60","Type":"ContainerStarted","Data":"8c919898318e9e2c137c7e8e7b1ee40e5c7254e9e2ebbb78404d1385aeecb135"} Mar 11 19:16:01 crc kubenswrapper[4842]: I0311 19:16:01.441858 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:01 crc kubenswrapper[4842]: I0311 19:16:01.442802 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:01 crc kubenswrapper[4842]: I0311 19:16:01.445553 4842 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:16:01 crc kubenswrapper[4842]: I0311 19:16:01.457962 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:16:01 crc kubenswrapper[4842]: I0311 19:16:01.462943 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:16:01 crc kubenswrapper[4842]: I0311 19:16:01.463799 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:16:02 crc kubenswrapper[4842]: I0311 19:16:02.248683 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:16:02 crc kubenswrapper[4842]: I0311 19:16:02.256946 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:16:03 crc kubenswrapper[4842]: I0311 19:16:03.251796 4842 generic.go:334] "Generic (PLEG): container finished" podID="99cb0c62-5098-459d-b8e2-290d05c30b60" containerID="69dc0bc64cdcdcb52600f26a1d6456519d9859ba5deb19d9d6b4a2068d54b12e" exitCode=0
Mar 11 19:16:03 crc kubenswrapper[4842]: I0311 19:16:03.251857 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554276-9xs5k" event={"ID":"99cb0c62-5098-459d-b8e2-290d05c30b60","Type":"ContainerDied","Data":"69dc0bc64cdcdcb52600f26a1d6456519d9859ba5deb19d9d6b4a2068d54b12e"}
Mar 11 19:16:04 crc kubenswrapper[4842]: I0311 19:16:04.573967 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554276-9xs5k"
Mar 11 19:16:04 crc kubenswrapper[4842]: I0311 19:16:04.602893 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmrz\" (UniqueName: \"kubernetes.io/projected/99cb0c62-5098-459d-b8e2-290d05c30b60-kube-api-access-4hmrz\") pod \"99cb0c62-5098-459d-b8e2-290d05c30b60\" (UID: \"99cb0c62-5098-459d-b8e2-290d05c30b60\") "
Mar 11 19:16:04 crc kubenswrapper[4842]: I0311 19:16:04.608877 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99cb0c62-5098-459d-b8e2-290d05c30b60-kube-api-access-4hmrz" (OuterVolumeSpecName: "kube-api-access-4hmrz") pod "99cb0c62-5098-459d-b8e2-290d05c30b60" (UID: "99cb0c62-5098-459d-b8e2-290d05c30b60"). InnerVolumeSpecName "kube-api-access-4hmrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:04 crc kubenswrapper[4842]: I0311 19:16:04.705029 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmrz\" (UniqueName: \"kubernetes.io/projected/99cb0c62-5098-459d-b8e2-290d05c30b60-kube-api-access-4hmrz\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:05 crc kubenswrapper[4842]: I0311 19:16:05.269747 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554276-9xs5k" event={"ID":"99cb0c62-5098-459d-b8e2-290d05c30b60","Type":"ContainerDied","Data":"8c919898318e9e2c137c7e8e7b1ee40e5c7254e9e2ebbb78404d1385aeecb135"}
Mar 11 19:16:05 crc kubenswrapper[4842]: I0311 19:16:05.269785 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c919898318e9e2c137c7e8e7b1ee40e5c7254e9e2ebbb78404d1385aeecb135"
Mar 11 19:16:05 crc kubenswrapper[4842]: I0311 19:16:05.269818 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554276-9xs5k"
Mar 11 19:16:05 crc kubenswrapper[4842]: I0311 19:16:05.647677 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554270-4phpn"]
Mar 11 19:16:05 crc kubenswrapper[4842]: I0311 19:16:05.658007 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554270-4phpn"]
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.051359 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.051647 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="a208ab67-07bd-43b7-8ec5-1408824103b8" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.079662 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.079913 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-log" containerID="cri-o://1eb1bb479f5c0abf3a7a874550e727c46e159730520693b2909310cc1ea503e4" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.080032 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-api" containerID="cri-o://1374e0770149dc741b6795bacd9dd8ea59ba4a3e3a5f5f0bf2b4359e52207329" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.158331 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.158635 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="7428f2c1-f985-4599-95dc-64e530030eb0" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.169718 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.169925 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" containerName="nova-kuttl-cell1-novncproxy-novncproxy" containerID="cri-o://31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.179473 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.179725 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.179926 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-log" containerID="cri-o://4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.248137 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.248361 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="a2088551-e0e8-474b-b472-63b60ca972c0" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://586fd9696e7e5d4b8dba356243d03e667d6737a2bbb02c29994a7397c74731da" gracePeriod=30
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.283601 4842 generic.go:334] "Generic (PLEG): container finished" podID="43541093-2b42-4aa7-8371-c1255581d7ee" containerID="1eb1bb479f5c0abf3a7a874550e727c46e159730520693b2909310cc1ea503e4" exitCode=143
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.283644 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"43541093-2b42-4aa7-8371-c1255581d7ee","Type":"ContainerDied","Data":"1eb1bb479f5c0abf3a7a874550e727c46e159730520693b2909310cc1ea503e4"}
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.966149 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:06 crc kubenswrapper[4842]: I0311 19:16:06.977652 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf" path="/var/lib/kubelet/pods/9e3b5fd4-ea6b-4622-b97a-018a6f8bcedf/volumes"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.049850 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-config-data\") pod \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") "
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.050148 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5259\" (UniqueName: \"kubernetes.io/projected/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-kube-api-access-p5259\") pod \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\" (UID: \"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239\") "
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.054555 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-kube-api-access-p5259" (OuterVolumeSpecName: "kube-api-access-p5259") pod "e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" (UID: "e8c1be6f-1e2e-40e1-95c6-831ccd3a9239"). InnerVolumeSpecName "kube-api-access-p5259". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.071912 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-config-data" (OuterVolumeSpecName: "config-data") pod "e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" (UID: "e8c1be6f-1e2e-40e1-95c6-831ccd3a9239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.151514 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.151563 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5259\" (UniqueName: \"kubernetes.io/projected/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239-kube-api-access-p5259\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.293367 4842 generic.go:334] "Generic (PLEG): container finished" podID="e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" containerID="31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d" exitCode=0
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.293408 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.293463 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239","Type":"ContainerDied","Data":"31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d"}
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.293538 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"e8c1be6f-1e2e-40e1-95c6-831ccd3a9239","Type":"ContainerDied","Data":"cf71cb2203d591d7a8cafe334a8a007f576ff46b625747c572b090c668d5530f"}
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.293568 4842 scope.go:117] "RemoveContainer" containerID="31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.300610 4842 generic.go:334] "Generic (PLEG): container finished" podID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerID="4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588" exitCode=143
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.300659 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5e69f281-2894-4ac6-b64b-d83754a1f246","Type":"ContainerDied","Data":"4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588"}
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.326361 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.329982 4842 scope.go:117] "RemoveContainer" containerID="31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d"
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.330437 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d\": container with ID starting with 31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d not found: ID does not exist" containerID="31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.330477 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d"} err="failed to get container status \"31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d\": rpc error: code = NotFound desc = could not find container \"31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d\": container with ID starting with 31341210e4c0e11b3d87dfe1192633123f33ea31e759fb72b6368db2e7723d7d not found: ID does not exist"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.336095 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.357734 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.358344 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.358521 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.358590 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cb0c62-5098-459d-b8e2-290d05c30b60" containerName="oc"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.358609 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cb0c62-5098-459d-b8e2-290d05c30b60" containerName="oc"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.358993 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.359076 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cb0c62-5098-459d-b8e2-290d05c30b60" containerName="oc"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.360336 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.363874 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-novncproxy-config-data"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.364339 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.557521 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7mzw\" (UniqueName: \"kubernetes.io/projected/84f0d804-f474-49e5-8644-29e6adeb8e17-kube-api-access-q7mzw\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.557896 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f0d804-f474-49e5-8644-29e6adeb8e17-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.661774 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7mzw\" (UniqueName: \"kubernetes.io/projected/84f0d804-f474-49e5-8644-29e6adeb8e17-kube-api-access-q7mzw\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.662479 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f0d804-f474-49e5-8644-29e6adeb8e17-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.669014 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f0d804-f474-49e5-8644-29e6adeb8e17-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.732093 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7mzw\" (UniqueName: \"kubernetes.io/projected/84f0d804-f474-49e5-8644-29e6adeb8e17-kube-api-access-q7mzw\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.831537 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.833287 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.834954 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.835074 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="a208ab67-07bd-43b7-8ec5-1408824103b8" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.962117 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:16:07 crc kubenswrapper[4842]: E0311 19:16:07.962623 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.979973 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:07 crc kubenswrapper[4842]: I0311 19:16:07.988218 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.069476 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7428f2c1-f985-4599-95dc-64e530030eb0-config-data\") pod \"7428f2c1-f985-4599-95dc-64e530030eb0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") "
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.069894 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhpts\" (UniqueName: \"kubernetes.io/projected/7428f2c1-f985-4599-95dc-64e530030eb0-kube-api-access-qhpts\") pod \"7428f2c1-f985-4599-95dc-64e530030eb0\" (UID: \"7428f2c1-f985-4599-95dc-64e530030eb0\") "
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.077074 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7428f2c1-f985-4599-95dc-64e530030eb0-kube-api-access-qhpts" (OuterVolumeSpecName: "kube-api-access-qhpts") pod "7428f2c1-f985-4599-95dc-64e530030eb0" (UID: "7428f2c1-f985-4599-95dc-64e530030eb0"). InnerVolumeSpecName "kube-api-access-qhpts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.103427 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7428f2c1-f985-4599-95dc-64e530030eb0-config-data" (OuterVolumeSpecName: "config-data") pod "7428f2c1-f985-4599-95dc-64e530030eb0" (UID: "7428f2c1-f985-4599-95dc-64e530030eb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.171807 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7428f2c1-f985-4599-95dc-64e530030eb0-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.171848 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhpts\" (UniqueName: \"kubernetes.io/projected/7428f2c1-f985-4599-95dc-64e530030eb0-kube-api-access-qhpts\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.309975 4842 generic.go:334] "Generic (PLEG): container finished" podID="7428f2c1-f985-4599-95dc-64e530030eb0" containerID="b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509" exitCode=0
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.310155 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7428f2c1-f985-4599-95dc-64e530030eb0","Type":"ContainerDied","Data":"b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509"}
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.310357 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7428f2c1-f985-4599-95dc-64e530030eb0","Type":"ContainerDied","Data":"2279ee4aef429b8e727a9115c793a560a1bd7a5aa9734c56aaf1750fa450f0a6"}
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.310200 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.310483 4842 scope.go:117] "RemoveContainer" containerID="b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.341971 4842 scope.go:117] "RemoveContainer" containerID="b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.342616 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:16:08 crc kubenswrapper[4842]: E0311 19:16:08.342927 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509\": container with ID starting with b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509 not found: ID does not exist" containerID="b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.342960 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509"} err="failed to get container status \"b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509\": rpc error: code = NotFound desc = could not find container \"b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509\": container with ID starting with b5e15c18554717c2a901460c9236be6bdeebaa5dcca4d2dbf967a1833c6ee509 not found: ID does not exist"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.349905 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.369971 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:16:08 crc kubenswrapper[4842]: E0311 19:16:08.370348 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7428f2c1-f985-4599-95dc-64e530030eb0" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.370364 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7428f2c1-f985-4599-95dc-64e530030eb0" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.370510 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7428f2c1-f985-4599-95dc-64e530030eb0" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.371142 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.373998 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.374142 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlhsh\" (UniqueName: \"kubernetes.io/projected/e51027d9-781b-41f6-93af-8bf32501bc63-kube-api-access-mlhsh\") pod \"nova-kuttl-scheduler-0\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.374248 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51027d9-781b-41f6-93af-8bf32501bc63-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.379365 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.432403 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.482656 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51027d9-781b-41f6-93af-8bf32501bc63-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.483362 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlhsh\" (UniqueName: \"kubernetes.io/projected/e51027d9-781b-41f6-93af-8bf32501bc63-kube-api-access-mlhsh\") pod \"nova-kuttl-scheduler-0\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.494125 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51027d9-781b-41f6-93af-8bf32501bc63-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.508262 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlhsh\" (UniqueName: \"kubernetes.io/projected/e51027d9-781b-41f6-93af-8bf32501bc63-kube-api-access-mlhsh\") pod \"nova-kuttl-scheduler-0\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.702673 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.972977 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7428f2c1-f985-4599-95dc-64e530030eb0" path="/var/lib/kubelet/pods/7428f2c1-f985-4599-95dc-64e530030eb0/volumes"
Mar 11 19:16:08 crc kubenswrapper[4842]: I0311 19:16:08.973872 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c1be6f-1e2e-40e1-95c6-831ccd3a9239" path="/var/lib/kubelet/pods/e8c1be6f-1e2e-40e1-95c6-831ccd3a9239/volumes"
Mar 11 19:16:09 crc kubenswrapper[4842]: I0311 19:16:09.159008 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:16:09 crc kubenswrapper[4842]: W0311 19:16:09.161995 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode51027d9_781b_41f6_93af_8bf32501bc63.slice/crio-6ac2a8f79d7053f55ccb6a0ca082ab396a97cb21dea1185b1b4588ca80d3d1d3 WatchSource:0}: Error finding container 6ac2a8f79d7053f55ccb6a0ca082ab396a97cb21dea1185b1b4588ca80d3d1d3: Status 404 returned error can't find the container with id 6ac2a8f79d7053f55ccb6a0ca082ab396a97cb21dea1185b1b4588ca80d3d1d3
Mar 11 19:16:09 crc kubenswrapper[4842]: I0311 19:16:09.321486 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"84f0d804-f474-49e5-8644-29e6adeb8e17","Type":"ContainerStarted","Data":"d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb"}
Mar 11 19:16:09 crc kubenswrapper[4842]: I0311 19:16:09.321854 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"84f0d804-f474-49e5-8644-29e6adeb8e17","Type":"ContainerStarted","Data":"c63dc36577745f63980c48b26413a525a3372d4841fd19c3bdfa6ff4b1a7a64c"}
Mar 11 19:16:09 crc kubenswrapper[4842]: I0311 19:16:09.326167 4842 generic.go:334] "Generic (PLEG): container finished" podID="43541093-2b42-4aa7-8371-c1255581d7ee" containerID="1374e0770149dc741b6795bacd9dd8ea59ba4a3e3a5f5f0bf2b4359e52207329" exitCode=0
Mar 11 19:16:09 crc kubenswrapper[4842]: I0311 19:16:09.326256 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"43541093-2b42-4aa7-8371-c1255581d7ee","Type":"ContainerDied","Data":"1374e0770149dc741b6795bacd9dd8ea59ba4a3e3a5f5f0bf2b4359e52207329"}
Mar 11 19:16:09 crc kubenswrapper[4842]: I0311 19:16:09.327520 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"e51027d9-781b-41f6-93af-8bf32501bc63","Type":"ContainerStarted","Data":"6ac2a8f79d7053f55ccb6a0ca082ab396a97cb21dea1185b1b4588ca80d3d1d3"}
Mar 11 19:16:09 crc kubenswrapper[4842]: I0311 19:16:09.349142 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podStartSLOduration=2.349118311 podStartE2EDuration="2.349118311s" podCreationTimestamp="2026-03-11 19:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:09.339236271 +0000 UTC m=+1614.986932551" watchObservedRunningTime="2026-03-11 19:16:09.349118311 +0000 UTC m=+1614.996814601"
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.625407 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.681697 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.708230 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znvxz\" (UniqueName: \"kubernetes.io/projected/5e69f281-2894-4ac6-b64b-d83754a1f246-kube-api-access-znvxz\") pod \"5e69f281-2894-4ac6-b64b-d83754a1f246\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") "
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.708299 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69f281-2894-4ac6-b64b-d83754a1f246-config-data\") pod \"5e69f281-2894-4ac6-b64b-d83754a1f246\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") "
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.708326 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx9cm\" (UniqueName: \"kubernetes.io/projected/43541093-2b42-4aa7-8371-c1255581d7ee-kube-api-access-kx9cm\") pod \"43541093-2b42-4aa7-8371-c1255581d7ee\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") "
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.708390 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43541093-2b42-4aa7-8371-c1255581d7ee-logs\") pod \"43541093-2b42-4aa7-8371-c1255581d7ee\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") "
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.708428 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e69f281-2894-4ac6-b64b-d83754a1f246-logs\") pod \"5e69f281-2894-4ac6-b64b-d83754a1f246\" (UID: \"5e69f281-2894-4ac6-b64b-d83754a1f246\") "
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.708478 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43541093-2b42-4aa7-8371-c1255581d7ee-config-data\") pod \"43541093-2b42-4aa7-8371-c1255581d7ee\" (UID: \"43541093-2b42-4aa7-8371-c1255581d7ee\") "
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.711740 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43541093-2b42-4aa7-8371-c1255581d7ee-logs" (OuterVolumeSpecName: "logs") pod "43541093-2b42-4aa7-8371-c1255581d7ee" (UID: "43541093-2b42-4aa7-8371-c1255581d7ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.711777 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e69f281-2894-4ac6-b64b-d83754a1f246-logs" (OuterVolumeSpecName: "logs") pod "5e69f281-2894-4ac6-b64b-d83754a1f246" (UID: "5e69f281-2894-4ac6-b64b-d83754a1f246"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.715333 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e69f281-2894-4ac6-b64b-d83754a1f246-kube-api-access-znvxz" (OuterVolumeSpecName: "kube-api-access-znvxz") pod "5e69f281-2894-4ac6-b64b-d83754a1f246" (UID: "5e69f281-2894-4ac6-b64b-d83754a1f246"). InnerVolumeSpecName "kube-api-access-znvxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.716133 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43541093-2b42-4aa7-8371-c1255581d7ee-kube-api-access-kx9cm" (OuterVolumeSpecName: "kube-api-access-kx9cm") pod "43541093-2b42-4aa7-8371-c1255581d7ee" (UID: "43541093-2b42-4aa7-8371-c1255581d7ee"). InnerVolumeSpecName "kube-api-access-kx9cm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.734201 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43541093-2b42-4aa7-8371-c1255581d7ee-config-data" (OuterVolumeSpecName: "config-data") pod "43541093-2b42-4aa7-8371-c1255581d7ee" (UID: "43541093-2b42-4aa7-8371-c1255581d7ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.747670 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e69f281-2894-4ac6-b64b-d83754a1f246-config-data" (OuterVolumeSpecName: "config-data") pod "5e69f281-2894-4ac6-b64b-d83754a1f246" (UID: "5e69f281-2894-4ac6-b64b-d83754a1f246"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.810309 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znvxz\" (UniqueName: \"kubernetes.io/projected/5e69f281-2894-4ac6-b64b-d83754a1f246-kube-api-access-znvxz\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.810349 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69f281-2894-4ac6-b64b-d83754a1f246-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.810362 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx9cm\" (UniqueName: \"kubernetes.io/projected/43541093-2b42-4aa7-8371-c1255581d7ee-kube-api-access-kx9cm\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.810373 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43541093-2b42-4aa7-8371-c1255581d7ee-logs\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:10
crc kubenswrapper[4842]: I0311 19:16:09.810383 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e69f281-2894-4ac6-b64b-d83754a1f246-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:09.810392 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43541093-2b42-4aa7-8371-c1255581d7ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.337461 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"e51027d9-781b-41f6-93af-8bf32501bc63","Type":"ContainerStarted","Data":"9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a"} Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.342536 4842 generic.go:334] "Generic (PLEG): container finished" podID="a2088551-e0e8-474b-b472-63b60ca972c0" containerID="586fd9696e7e5d4b8dba356243d03e667d6737a2bbb02c29994a7397c74731da" exitCode=0 Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.342591 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"a2088551-e0e8-474b-b472-63b60ca972c0","Type":"ContainerDied","Data":"586fd9696e7e5d4b8dba356243d03e667d6737a2bbb02c29994a7397c74731da"} Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.342613 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"a2088551-e0e8-474b-b472-63b60ca972c0","Type":"ContainerDied","Data":"4d36a51eab6e063063ac7a90cfa4349baa87a4518b4d3d52075961d71a8e6e5c"} Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.342625 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d36a51eab6e063063ac7a90cfa4349baa87a4518b4d3d52075961d71a8e6e5c" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.344619 4842 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"43541093-2b42-4aa7-8371-c1255581d7ee","Type":"ContainerDied","Data":"b7d64249eed4a91ae5ca8037f86c376d9846218ae3971c3c23ea12a9e101acd5"} Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.344654 4842 scope.go:117] "RemoveContainer" containerID="1374e0770149dc741b6795bacd9dd8ea59ba4a3e3a5f5f0bf2b4359e52207329" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.344742 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.356216 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.356196383 podStartE2EDuration="2.356196383s" podCreationTimestamp="2026-03-11 19:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:10.354438597 +0000 UTC m=+1616.002134877" watchObservedRunningTime="2026-03-11 19:16:10.356196383 +0000 UTC m=+1616.003892663" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.360017 4842 generic.go:334] "Generic (PLEG): container finished" podID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerID="4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de" exitCode=0 Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.360257 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.360735 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5e69f281-2894-4ac6-b64b-d83754a1f246","Type":"ContainerDied","Data":"4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de"} Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.360778 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5e69f281-2894-4ac6-b64b-d83754a1f246","Type":"ContainerDied","Data":"84e071513e88cfaeb8274181f66502557f35b645b0caede4675e183dc423ae82"} Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.387053 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.395699 4842 scope.go:117] "RemoveContainer" containerID="1eb1bb479f5c0abf3a7a874550e727c46e159730520693b2909310cc1ea503e4" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.419336 4842 scope.go:117] "RemoveContainer" containerID="4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.419620 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57nwb\" (UniqueName: \"kubernetes.io/projected/a2088551-e0e8-474b-b472-63b60ca972c0-kube-api-access-57nwb\") pod \"a2088551-e0e8-474b-b472-63b60ca972c0\" (UID: \"a2088551-e0e8-474b-b472-63b60ca972c0\") " Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.419690 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2088551-e0e8-474b-b472-63b60ca972c0-config-data\") pod \"a2088551-e0e8-474b-b472-63b60ca972c0\" (UID: \"a2088551-e0e8-474b-b472-63b60ca972c0\") " Mar 11 19:16:10 crc 
kubenswrapper[4842]: I0311 19:16:10.423314 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.454337 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2088551-e0e8-474b-b472-63b60ca972c0-kube-api-access-57nwb" (OuterVolumeSpecName: "kube-api-access-57nwb") pod "a2088551-e0e8-474b-b472-63b60ca972c0" (UID: "a2088551-e0e8-474b-b472-63b60ca972c0"). InnerVolumeSpecName "kube-api-access-57nwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.481853 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2088551-e0e8-474b-b472-63b60ca972c0-config-data" (OuterVolumeSpecName: "config-data") pod "a2088551-e0e8-474b-b472-63b60ca972c0" (UID: "a2088551-e0e8-474b-b472-63b60ca972c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.481926 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.513796 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.520804 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57nwb\" (UniqueName: \"kubernetes.io/projected/a2088551-e0e8-474b-b472-63b60ca972c0-kube-api-access-57nwb\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.520837 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2088551-e0e8-474b-b472-63b60ca972c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.521993 4842 scope.go:117] 
"RemoveContainer" containerID="4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.527672 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536210 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: E0311 19:16:10.536577 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2088551-e0e8-474b-b472-63b60ca972c0" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536593 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2088551-e0e8-474b-b472-63b60ca972c0" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:10 crc kubenswrapper[4842]: E0311 19:16:10.536605 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-log" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536611 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-log" Mar 11 19:16:10 crc kubenswrapper[4842]: E0311 19:16:10.536629 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-log" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536635 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-log" Mar 11 19:16:10 crc kubenswrapper[4842]: E0311 19:16:10.536649 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536657 4842 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:10 crc kubenswrapper[4842]: E0311 19:16:10.536674 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-api" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536681 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-api" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536820 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-api" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536832 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-log" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536841 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" containerName="nova-kuttl-api-log" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536849 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2088551-e0e8-474b-b472-63b60ca972c0" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.536859 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.537793 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.543691 4842 scope.go:117] "RemoveContainer" containerID="4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.543705 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:16:10 crc kubenswrapper[4842]: E0311 19:16:10.544130 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de\": container with ID starting with 4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de not found: ID does not exist" containerID="4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.544161 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de"} err="failed to get container status \"4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de\": rpc error: code = NotFound desc = could not find container \"4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de\": container with ID starting with 4f8c524172418f7f649a3d5fade048ce27f3e6d389cb98fdd0764375669481de not found: ID does not exist" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.544182 4842 scope.go:117] "RemoveContainer" containerID="4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588" Mar 11 19:16:10 crc kubenswrapper[4842]: E0311 19:16:10.544740 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588\": container with ID starting with 
4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588 not found: ID does not exist" containerID="4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.544804 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588"} err="failed to get container status \"4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588\": rpc error: code = NotFound desc = could not find container \"4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588\": container with ID starting with 4f86760768867807ad3a0ca21509aeac2dcdb0d46758d3af41eaf5e131c56588 not found: ID does not exist" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.551304 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.564607 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.566063 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.568402 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.570859 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.622483 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aec138d-5919-431c-9b1a-ba46ae379e7f-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.622556 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aec138d-5919-431c-9b1a-ba46ae379e7f-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.622596 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af01486d-9b65-4e91-8ec4-107362d4988e-logs\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.622616 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfmx7\" (UniqueName: \"kubernetes.io/projected/4aec138d-5919-431c-9b1a-ba46ae379e7f-kube-api-access-rfmx7\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc 
kubenswrapper[4842]: I0311 19:16:10.622643 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n9gl\" (UniqueName: \"kubernetes.io/projected/af01486d-9b65-4e91-8ec4-107362d4988e-kube-api-access-8n9gl\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.622830 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af01486d-9b65-4e91-8ec4-107362d4988e-config-data\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.724565 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af01486d-9b65-4e91-8ec4-107362d4988e-logs\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.724627 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfmx7\" (UniqueName: \"kubernetes.io/projected/4aec138d-5919-431c-9b1a-ba46ae379e7f-kube-api-access-rfmx7\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.724667 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n9gl\" (UniqueName: \"kubernetes.io/projected/af01486d-9b65-4e91-8ec4-107362d4988e-kube-api-access-8n9gl\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.725010 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af01486d-9b65-4e91-8ec4-107362d4988e-config-data\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.725075 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af01486d-9b65-4e91-8ec4-107362d4988e-logs\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.725133 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aec138d-5919-431c-9b1a-ba46ae379e7f-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.725211 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aec138d-5919-431c-9b1a-ba46ae379e7f-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.725396 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aec138d-5919-431c-9b1a-ba46ae379e7f-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.729230 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aec138d-5919-431c-9b1a-ba46ae379e7f-config-data\") pod 
\"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.729431 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af01486d-9b65-4e91-8ec4-107362d4988e-config-data\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.748480 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n9gl\" (UniqueName: \"kubernetes.io/projected/af01486d-9b65-4e91-8ec4-107362d4988e-kube-api-access-8n9gl\") pod \"nova-kuttl-api-0\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.754936 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfmx7\" (UniqueName: \"kubernetes.io/projected/4aec138d-5919-431c-9b1a-ba46ae379e7f-kube-api-access-rfmx7\") pod \"nova-kuttl-metadata-0\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.857830 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:10 crc kubenswrapper[4842]: I0311 19:16:10.887058 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.010773 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43541093-2b42-4aa7-8371-c1255581d7ee" path="/var/lib/kubelet/pods/43541093-2b42-4aa7-8371-c1255581d7ee/volumes" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.012234 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e69f281-2894-4ac6-b64b-d83754a1f246" path="/var/lib/kubelet/pods/5e69f281-2894-4ac6-b64b-d83754a1f246/volumes" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.335632 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.371883 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4aec138d-5919-431c-9b1a-ba46ae379e7f","Type":"ContainerStarted","Data":"43d9dd0923729eca74f57d308a8af57c61ba8b4225e1340a854443aba852711a"} Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.373295 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.405794 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.413604 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.464963 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.478735 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.478745 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.482113 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.487494 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.545181 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr88z\" (UniqueName: \"kubernetes.io/projected/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-kube-api-access-dr88z\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.545960 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.646949 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr88z\" (UniqueName: \"kubernetes.io/projected/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-kube-api-access-dr88z\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.647054 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.651528 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.662314 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr88z\" (UniqueName: \"kubernetes.io/projected/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-kube-api-access-dr88z\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.844444 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:16:11 crc kubenswrapper[4842]: I0311 19:16:11.956075 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.157679 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a208ab67-07bd-43b7-8ec5-1408824103b8-config-data\") pod \"a208ab67-07bd-43b7-8ec5-1408824103b8\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") "
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.158033 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlxnz\" (UniqueName: \"kubernetes.io/projected/a208ab67-07bd-43b7-8ec5-1408824103b8-kube-api-access-mlxnz\") pod \"a208ab67-07bd-43b7-8ec5-1408824103b8\" (UID: \"a208ab67-07bd-43b7-8ec5-1408824103b8\") "
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.180046 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a208ab67-07bd-43b7-8ec5-1408824103b8-config-data" (OuterVolumeSpecName: "config-data") pod "a208ab67-07bd-43b7-8ec5-1408824103b8" (UID: "a208ab67-07bd-43b7-8ec5-1408824103b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.182063 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a208ab67-07bd-43b7-8ec5-1408824103b8-kube-api-access-mlxnz" (OuterVolumeSpecName: "kube-api-access-mlxnz") pod "a208ab67-07bd-43b7-8ec5-1408824103b8" (UID: "a208ab67-07bd-43b7-8ec5-1408824103b8"). InnerVolumeSpecName "kube-api-access-mlxnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.260053 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a208ab67-07bd-43b7-8ec5-1408824103b8-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.260088 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlxnz\" (UniqueName: \"kubernetes.io/projected/a208ab67-07bd-43b7-8ec5-1408824103b8-kube-api-access-mlxnz\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.286715 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.384438 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"faab7cbc-3c65-4c49-bce1-14b8a0b091b1","Type":"ContainerStarted","Data":"1b0b8423f087240664d6a5fb195bfc992b6465baf1909719d7c6f146a2380ddf"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.385978 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4aec138d-5919-431c-9b1a-ba46ae379e7f","Type":"ContainerStarted","Data":"103594b46390346cb727c5f4fde4dc80b8b1836ab75e3bd90dbcf92605ca96ee"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.386022 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4aec138d-5919-431c-9b1a-ba46ae379e7f","Type":"ContainerStarted","Data":"ba7bcd3f4ed05a5a21376f0ece1119ce6b43548dbb0d361f0c2495fd30805e06"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.388889 4842 generic.go:334] "Generic (PLEG): container finished" podID="a208ab67-07bd-43b7-8ec5-1408824103b8" containerID="4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216" exitCode=0
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.388942 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"a208ab67-07bd-43b7-8ec5-1408824103b8","Type":"ContainerDied","Data":"4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.388959 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"a208ab67-07bd-43b7-8ec5-1408824103b8","Type":"ContainerDied","Data":"b5e049d5ae6248c8a4de6a9a73c751874a98ef4d18e313dc0f81119b88c2c51d"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.388956 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.388976 4842 scope.go:117] "RemoveContainer" containerID="4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.392785 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"af01486d-9b65-4e91-8ec4-107362d4988e","Type":"ContainerStarted","Data":"5f1ce1c51d846d279e12fc5511255acb62157ec7567c27130cb550e77f4e3e87"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.392888 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"af01486d-9b65-4e91-8ec4-107362d4988e","Type":"ContainerStarted","Data":"4bbac15eaca177eb9799f1a5af6ea0a73513cfefb62f4b1ae7c93b32b7fa7a4e"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.392940 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"af01486d-9b65-4e91-8ec4-107362d4988e","Type":"ContainerStarted","Data":"b91c4bd7d451e9ef983a5b06bff98df83ff6df938c82c5c3a51a8c7c79c21826"}
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.417461 4842 scope.go:117] "RemoveContainer" containerID="4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216"
Mar 11 19:16:12 crc kubenswrapper[4842]: E0311 19:16:12.417830 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216\": container with ID starting with 4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216 not found: ID does not exist" containerID="4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.417883 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216"} err="failed to get container status \"4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216\": rpc error: code = NotFound desc = could not find container \"4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216\": container with ID starting with 4eded2ff3743bf3cf0596a3c55dd265477d7ed51d7414bdbf25c459c07f5a216 not found: ID does not exist"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.430861 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.430839069 podStartE2EDuration="2.430839069s" podCreationTimestamp="2026-03-11 19:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:12.406850598 +0000 UTC m=+1618.054546938" watchObservedRunningTime="2026-03-11 19:16:12.430839069 +0000 UTC m=+1618.078535349"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.444898 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.444861838 podStartE2EDuration="2.444861838s" podCreationTimestamp="2026-03-11 19:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:12.429219277 +0000 UTC m=+1618.076915587" watchObservedRunningTime="2026-03-11 19:16:12.444861838 +0000 UTC m=+1618.092558118"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.458998 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.464982 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.494369 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:12 crc kubenswrapper[4842]: E0311 19:16:12.494685 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a208ab67-07bd-43b7-8ec5-1408824103b8" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.494703 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a208ab67-07bd-43b7-8ec5-1408824103b8" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.494923 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a208ab67-07bd-43b7-8ec5-1408824103b8" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.495423 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.499593 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.508970 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.567726 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673ea753-c8fe-44ff-8389-a42bd219aa40-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.567775 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncfd2\" (UniqueName: \"kubernetes.io/projected/673ea753-c8fe-44ff-8389-a42bd219aa40-kube-api-access-ncfd2\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.670399 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673ea753-c8fe-44ff-8389-a42bd219aa40-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.670649 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncfd2\" (UniqueName: \"kubernetes.io/projected/673ea753-c8fe-44ff-8389-a42bd219aa40-kube-api-access-ncfd2\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.677680 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673ea753-c8fe-44ff-8389-a42bd219aa40-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.688530 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncfd2\" (UniqueName: \"kubernetes.io/projected/673ea753-c8fe-44ff-8389-a42bd219aa40-kube-api-access-ncfd2\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.840249 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.973323 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2088551-e0e8-474b-b472-63b60ca972c0" path="/var/lib/kubelet/pods/a2088551-e0e8-474b-b472-63b60ca972c0/volumes"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.973880 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a208ab67-07bd-43b7-8ec5-1408824103b8" path="/var/lib/kubelet/pods/a208ab67-07bd-43b7-8ec5-1408824103b8/volumes"
Mar 11 19:16:12 crc kubenswrapper[4842]: I0311 19:16:12.981751 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:13 crc kubenswrapper[4842]: I0311 19:16:13.334231 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:13 crc kubenswrapper[4842]: I0311 19:16:13.403785 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"673ea753-c8fe-44ff-8389-a42bd219aa40","Type":"ContainerStarted","Data":"7b4a371b390306f8cd4818fba03bdbf9dafdd467c474842cf57f919068d82e6f"}
Mar 11 19:16:13 crc kubenswrapper[4842]: I0311 19:16:13.406712 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"faab7cbc-3c65-4c49-bce1-14b8a0b091b1","Type":"ContainerStarted","Data":"36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9"}
Mar 11 19:16:13 crc kubenswrapper[4842]: I0311 19:16:13.408030 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:16:13 crc kubenswrapper[4842]: I0311 19:16:13.429649 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=2.429631874 podStartE2EDuration="2.429631874s" podCreationTimestamp="2026-03-11 19:16:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:13.421587192 +0000 UTC m=+1619.069283482" watchObservedRunningTime="2026-03-11 19:16:13.429631874 +0000 UTC m=+1619.077328154"
Mar 11 19:16:13 crc kubenswrapper[4842]: I0311 19:16:13.704202 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:14 crc kubenswrapper[4842]: I0311 19:16:14.421162 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"673ea753-c8fe-44ff-8389-a42bd219aa40","Type":"ContainerStarted","Data":"aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5"}
Mar 11 19:16:15 crc kubenswrapper[4842]: I0311 19:16:15.431396 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:17 crc kubenswrapper[4842]: I0311 19:16:17.981462 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:17 crc kubenswrapper[4842]: I0311 19:16:17.996807 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:18 crc kubenswrapper[4842]: I0311 19:16:18.016875 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podStartSLOduration=6.016857379 podStartE2EDuration="6.016857379s" podCreationTimestamp="2026-03-11 19:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:14.444988864 +0000 UTC m=+1620.092685144" watchObservedRunningTime="2026-03-11 19:16:18.016857379 +0000 UTC m=+1623.664553649"
Mar 11 19:16:18 crc kubenswrapper[4842]: I0311 19:16:18.478881 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:18 crc kubenswrapper[4842]: I0311 19:16:18.703950 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:18 crc kubenswrapper[4842]: I0311 19:16:18.724430 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:19 crc kubenswrapper[4842]: I0311 19:16:19.507136 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:16:20 crc kubenswrapper[4842]: I0311 19:16:20.859105 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:16:20 crc kubenswrapper[4842]: I0311 19:16:20.859952 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:16:20 crc kubenswrapper[4842]: I0311 19:16:20.887877 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:16:20 crc kubenswrapper[4842]: I0311 19:16:20.887967 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:16:21 crc kubenswrapper[4842]: I0311 19:16:21.887443 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:16:21 crc kubenswrapper[4842]: I0311 19:16:21.962210 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:16:21 crc kubenswrapper[4842]: E0311 19:16:21.962547 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:16:22 crc kubenswrapper[4842]: I0311 19:16:22.023540 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.183:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:16:22 crc kubenswrapper[4842]: I0311 19:16:22.023558 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:16:22 crc kubenswrapper[4842]: I0311 19:16:22.023538 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:16:22 crc kubenswrapper[4842]: I0311 19:16:22.023550 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.183:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:16:22 crc kubenswrapper[4842]: I0311 19:16:22.873261 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.080009 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.080237 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="673ea753-c8fe-44ff-8389-a42bd219aa40" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.153470 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.153665 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="e51027d9-781b-41f6-93af-8bf32501bc63" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.166319 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.166525 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-log" containerID="cri-o://ba7bcd3f4ed05a5a21376f0ece1119ce6b43548dbb0d361f0c2495fd30805e06" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.166560 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://103594b46390346cb727c5f4fde4dc80b8b1836ab75e3bd90dbcf92605ca96ee" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.177343 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.177630 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-log" containerID="cri-o://4bbac15eaca177eb9799f1a5af6ea0a73513cfefb62f4b1ae7c93b32b7fa7a4e" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.178445 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-api" containerID="cri-o://5f1ce1c51d846d279e12fc5511255acb62157ec7567c27130cb550e77f4e3e87" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.193505 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.193747 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="84f0d804-f474-49e5-8644-29e6adeb8e17" containerName="nova-kuttl-cell1-novncproxy-novncproxy" containerID="cri-o://d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.340341 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.340540 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="faab7cbc-3c65-4c49-bce1-14b8a0b091b1" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9" gracePeriod=30
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.534911 4842 generic.go:334] "Generic (PLEG): container finished" podID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerID="ba7bcd3f4ed05a5a21376f0ece1119ce6b43548dbb0d361f0c2495fd30805e06" exitCode=143
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.534992 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4aec138d-5919-431c-9b1a-ba46ae379e7f","Type":"ContainerDied","Data":"ba7bcd3f4ed05a5a21376f0ece1119ce6b43548dbb0d361f0c2495fd30805e06"}
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.537048 4842 generic.go:334] "Generic (PLEG): container finished" podID="af01486d-9b65-4e91-8ec4-107362d4988e" containerID="4bbac15eaca177eb9799f1a5af6ea0a73513cfefb62f4b1ae7c93b32b7fa7a4e" exitCode=143
Mar 11 19:16:24 crc kubenswrapper[4842]: I0311 19:16:24.537093 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"af01486d-9b65-4e91-8ec4-107362d4988e","Type":"ContainerDied","Data":"4bbac15eaca177eb9799f1a5af6ea0a73513cfefb62f4b1ae7c93b32b7fa7a4e"}
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.001645 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.009516 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7mzw\" (UniqueName: \"kubernetes.io/projected/84f0d804-f474-49e5-8644-29e6adeb8e17-kube-api-access-q7mzw\") pod \"84f0d804-f474-49e5-8644-29e6adeb8e17\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") "
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.009629 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f0d804-f474-49e5-8644-29e6adeb8e17-config-data\") pod \"84f0d804-f474-49e5-8644-29e6adeb8e17\" (UID: \"84f0d804-f474-49e5-8644-29e6adeb8e17\") "
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.019346 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f0d804-f474-49e5-8644-29e6adeb8e17-kube-api-access-q7mzw" (OuterVolumeSpecName: "kube-api-access-q7mzw") pod "84f0d804-f474-49e5-8644-29e6adeb8e17" (UID: "84f0d804-f474-49e5-8644-29e6adeb8e17"). InnerVolumeSpecName "kube-api-access-q7mzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.077510 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84f0d804-f474-49e5-8644-29e6adeb8e17-config-data" (OuterVolumeSpecName: "config-data") pod "84f0d804-f474-49e5-8644-29e6adeb8e17" (UID: "84f0d804-f474-49e5-8644-29e6adeb8e17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.111177 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7mzw\" (UniqueName: \"kubernetes.io/projected/84f0d804-f474-49e5-8644-29e6adeb8e17-kube-api-access-q7mzw\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.111212 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84f0d804-f474-49e5-8644-29e6adeb8e17-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.236447 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.415853 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673ea753-c8fe-44ff-8389-a42bd219aa40-config-data\") pod \"673ea753-c8fe-44ff-8389-a42bd219aa40\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") "
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.415962 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncfd2\" (UniqueName: \"kubernetes.io/projected/673ea753-c8fe-44ff-8389-a42bd219aa40-kube-api-access-ncfd2\") pod \"673ea753-c8fe-44ff-8389-a42bd219aa40\" (UID: \"673ea753-c8fe-44ff-8389-a42bd219aa40\") "
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.420721 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/673ea753-c8fe-44ff-8389-a42bd219aa40-kube-api-access-ncfd2" (OuterVolumeSpecName: "kube-api-access-ncfd2") pod "673ea753-c8fe-44ff-8389-a42bd219aa40" (UID: "673ea753-c8fe-44ff-8389-a42bd219aa40"). InnerVolumeSpecName "kube-api-access-ncfd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.441751 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/673ea753-c8fe-44ff-8389-a42bd219aa40-config-data" (OuterVolumeSpecName: "config-data") pod "673ea753-c8fe-44ff-8389-a42bd219aa40" (UID: "673ea753-c8fe-44ff-8389-a42bd219aa40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.518330 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/673ea753-c8fe-44ff-8389-a42bd219aa40-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.518362 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncfd2\" (UniqueName: \"kubernetes.io/projected/673ea753-c8fe-44ff-8389-a42bd219aa40-kube-api-access-ncfd2\") on node \"crc\" DevicePath \"\""
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.546394 4842 generic.go:334] "Generic (PLEG): container finished" podID="84f0d804-f474-49e5-8644-29e6adeb8e17" containerID="d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb" exitCode=0
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.546452 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.546483 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"84f0d804-f474-49e5-8644-29e6adeb8e17","Type":"ContainerDied","Data":"d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb"}
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.546532 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"84f0d804-f474-49e5-8644-29e6adeb8e17","Type":"ContainerDied","Data":"c63dc36577745f63980c48b26413a525a3372d4841fd19c3bdfa6ff4b1a7a64c"}
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.546550 4842 scope.go:117] "RemoveContainer" containerID="d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.549867 4842 generic.go:334] "Generic (PLEG): container finished" podID="673ea753-c8fe-44ff-8389-a42bd219aa40" containerID="aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5" exitCode=0
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.549915 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"673ea753-c8fe-44ff-8389-a42bd219aa40","Type":"ContainerDied","Data":"aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5"}
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.549946 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"673ea753-c8fe-44ff-8389-a42bd219aa40","Type":"ContainerDied","Data":"7b4a371b390306f8cd4818fba03bdbf9dafdd467c474842cf57f919068d82e6f"}
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.550020 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.571106 4842 scope.go:117] "RemoveContainer" containerID="d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb"
Mar 11 19:16:25 crc kubenswrapper[4842]: E0311 19:16:25.571577 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb\": container with ID starting with d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb not found: ID does not exist" containerID="d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.571653 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb"} err="failed to get container status \"d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb\": rpc error: code = NotFound desc = could not find container \"d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb\": container with ID starting with d5cae8b81b536c23f9802eb9cffacb84ebc73756d5d4006ff17f4b955c4c37eb not found: ID does not exist"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.571675 4842 scope.go:117] "RemoveContainer" containerID="aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.607991 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.619218 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.619992 4842 scope.go:117] "RemoveContainer" containerID="aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5"
Mar 11 19:16:25 crc kubenswrapper[4842]: E0311 19:16:25.623489 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5\": container with ID starting with aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5 not found: ID does not exist" containerID="aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.623531 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5"} err="failed to get container status \"aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5\": rpc error: code = NotFound desc = could not find container \"aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5\": container with ID starting with aa43038bd144117e88681d5d87f0ffe30c454b2fe764518c1bcf3d7a3c1933a5 not found: ID does not exist"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.631536 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.640915 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.653447 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: E0311 19:16:25.653925 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673ea753-c8fe-44ff-8389-a42bd219aa40" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.653954 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="673ea753-c8fe-44ff-8389-a42bd219aa40" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:16:25 crc kubenswrapper[4842]: E0311 19:16:25.653985 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f0d804-f474-49e5-8644-29e6adeb8e17" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.653997 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f0d804-f474-49e5-8644-29e6adeb8e17" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.654208 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f0d804-f474-49e5-8644-29e6adeb8e17" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.654235 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="673ea753-c8fe-44ff-8389-a42bd219aa40" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.654911 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.664729 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.669549 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.672058 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.675202 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-novncproxy-config-data"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.682033 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.690426 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.822092 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.822177 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f57vt\" (UniqueName: \"kubernetes.io/projected/70231d57-b20e-4eac-aa2c-29d1a7247ee6-kube-api-access-f57vt\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") "
pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.822223 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4726d5-86fc-4b0f-8c37-f8213ce10731-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.822246 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/9b4726d5-86fc-4b0f-8c37-f8213ce10731-kube-api-access-9jq8g\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.923469 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4726d5-86fc-4b0f-8c37-f8213ce10731-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.923518 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/9b4726d5-86fc-4b0f-8c37-f8213ce10731-kube-api-access-9jq8g\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.923637 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: 
\"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.923687 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f57vt\" (UniqueName: \"kubernetes.io/projected/70231d57-b20e-4eac-aa2c-29d1a7247ee6-kube-api-access-f57vt\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.928230 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4726d5-86fc-4b0f-8c37-f8213ce10731-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.928764 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.941861 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/9b4726d5-86fc-4b0f-8c37-f8213ce10731-kube-api-access-9jq8g\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:25 crc kubenswrapper[4842]: I0311 19:16:25.947761 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f57vt\" (UniqueName: \"kubernetes.io/projected/70231d57-b20e-4eac-aa2c-29d1a7247ee6-kube-api-access-f57vt\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: 
\"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.002828 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.027145 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.036391 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.126058 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlhsh\" (UniqueName: \"kubernetes.io/projected/e51027d9-781b-41f6-93af-8bf32501bc63-kube-api-access-mlhsh\") pod \"e51027d9-781b-41f6-93af-8bf32501bc63\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.126714 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51027d9-781b-41f6-93af-8bf32501bc63-config-data\") pod \"e51027d9-781b-41f6-93af-8bf32501bc63\" (UID: \"e51027d9-781b-41f6-93af-8bf32501bc63\") " Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.167013 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51027d9-781b-41f6-93af-8bf32501bc63-kube-api-access-mlhsh" (OuterVolumeSpecName: "kube-api-access-mlhsh") pod "e51027d9-781b-41f6-93af-8bf32501bc63" (UID: "e51027d9-781b-41f6-93af-8bf32501bc63"). InnerVolumeSpecName "kube-api-access-mlhsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.169436 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51027d9-781b-41f6-93af-8bf32501bc63-config-data" (OuterVolumeSpecName: "config-data") pod "e51027d9-781b-41f6-93af-8bf32501bc63" (UID: "e51027d9-781b-41f6-93af-8bf32501bc63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.230127 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e51027d9-781b-41f6-93af-8bf32501bc63-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.230166 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlhsh\" (UniqueName: \"kubernetes.io/projected/e51027d9-781b-41f6-93af-8bf32501bc63-kube-api-access-mlhsh\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.548033 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:16:26 crc kubenswrapper[4842]: W0311 19:16:26.549076 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4726d5_86fc_4b0f_8c37_f8213ce10731.slice/crio-d80bf1a3f7f7072b701d83d596ac38c3f605c60aed7dd06f276c6f0357c780d6 WatchSource:0}: Error finding container d80bf1a3f7f7072b701d83d596ac38c3f605c60aed7dd06f276c6f0357c780d6: Status 404 returned error can't find the container with id d80bf1a3f7f7072b701d83d596ac38c3f605c60aed7dd06f276c6f0357c780d6 Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.560996 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" 
event={"ID":"9b4726d5-86fc-4b0f-8c37-f8213ce10731","Type":"ContainerStarted","Data":"d80bf1a3f7f7072b701d83d596ac38c3f605c60aed7dd06f276c6f0357c780d6"} Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.563176 4842 generic.go:334] "Generic (PLEG): container finished" podID="e51027d9-781b-41f6-93af-8bf32501bc63" containerID="9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a" exitCode=0 Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.563283 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"e51027d9-781b-41f6-93af-8bf32501bc63","Type":"ContainerDied","Data":"9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a"} Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.563320 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"e51027d9-781b-41f6-93af-8bf32501bc63","Type":"ContainerDied","Data":"6ac2a8f79d7053f55ccb6a0ca082ab396a97cb21dea1185b1b4588ca80d3d1d3"} Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.563342 4842 scope.go:117] "RemoveContainer" containerID="9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.563245 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.657182 4842 scope.go:117] "RemoveContainer" containerID="9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a" Mar 11 19:16:26 crc kubenswrapper[4842]: E0311 19:16:26.661029 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a\": container with ID starting with 9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a not found: ID does not exist" containerID="9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.661247 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a"} err="failed to get container status \"9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a\": rpc error: code = NotFound desc = could not find container \"9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a\": container with ID starting with 9dadd6ee339173247e08aa206d7df2934d2ccba7cd37f145930c543491bda67a not found: ID does not exist" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.676141 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:16:26 crc kubenswrapper[4842]: W0311 19:16:26.686427 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70231d57_b20e_4eac_aa2c_29d1a7247ee6.slice/crio-6e8945c6350e541b48d8691e1a5dd71534857af34034fdbaab8445c47902086c WatchSource:0}: Error finding container 6e8945c6350e541b48d8691e1a5dd71534857af34034fdbaab8445c47902086c: Status 404 returned error can't find the container with id 
6e8945c6350e541b48d8691e1a5dd71534857af34034fdbaab8445c47902086c Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.746192 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.762213 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.776080 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:26 crc kubenswrapper[4842]: E0311 19:16:26.777605 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51027d9-781b-41f6-93af-8bf32501bc63" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.777629 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51027d9-781b-41f6-93af-8bf32501bc63" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.777849 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51027d9-781b-41f6-93af-8bf32501bc63" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.778577 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.780778 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.790484 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:26 crc kubenswrapper[4842]: E0311 19:16:26.848120 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:16:26 crc kubenswrapper[4842]: E0311 19:16:26.850023 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:16:26 crc kubenswrapper[4842]: E0311 19:16:26.851912 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:16:26 crc kubenswrapper[4842]: E0311 19:16:26.851956 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="faab7cbc-3c65-4c49-bce1-14b8a0b091b1" 
containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.941091 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393e9a7f-b46d-4bff-808a-68dcb6455015-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.941319 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2m8g\" (UniqueName: \"kubernetes.io/projected/393e9a7f-b46d-4bff-808a-68dcb6455015-kube-api-access-k2m8g\") pod \"nova-kuttl-scheduler-0\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.989706 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="673ea753-c8fe-44ff-8389-a42bd219aa40" path="/var/lib/kubelet/pods/673ea753-c8fe-44ff-8389-a42bd219aa40/volumes" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.990954 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f0d804-f474-49e5-8644-29e6adeb8e17" path="/var/lib/kubelet/pods/84f0d804-f474-49e5-8644-29e6adeb8e17/volumes" Mar 11 19:16:26 crc kubenswrapper[4842]: I0311 19:16:26.991717 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e51027d9-781b-41f6-93af-8bf32501bc63" path="/var/lib/kubelet/pods/e51027d9-781b-41f6-93af-8bf32501bc63/volumes" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.043238 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393e9a7f-b46d-4bff-808a-68dcb6455015-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 
11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.043994 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2m8g\" (UniqueName: \"kubernetes.io/projected/393e9a7f-b46d-4bff-808a-68dcb6455015-kube-api-access-k2m8g\") pod \"nova-kuttl-scheduler-0\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.047593 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393e9a7f-b46d-4bff-808a-68dcb6455015-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.063477 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2m8g\" (UniqueName: \"kubernetes.io/projected/393e9a7f-b46d-4bff-808a-68dcb6455015-kube-api-access-k2m8g\") pod \"nova-kuttl-scheduler-0\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.105899 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.575120 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"9b4726d5-86fc-4b0f-8c37-f8213ce10731","Type":"ContainerStarted","Data":"585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90"} Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.575932 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.578644 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"70231d57-b20e-4eac-aa2c-29d1a7247ee6","Type":"ContainerStarted","Data":"77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce"} Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.578695 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"70231d57-b20e-4eac-aa2c-29d1a7247ee6","Type":"ContainerStarted","Data":"6e8945c6350e541b48d8691e1a5dd71534857af34034fdbaab8445c47902086c"} Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.582614 4842 generic.go:334] "Generic (PLEG): container finished" podID="af01486d-9b65-4e91-8ec4-107362d4988e" containerID="5f1ce1c51d846d279e12fc5511255acb62157ec7567c27130cb550e77f4e3e87" exitCode=0 Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.582669 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"af01486d-9b65-4e91-8ec4-107362d4988e","Type":"ContainerDied","Data":"5f1ce1c51d846d279e12fc5511255acb62157ec7567c27130cb550e77f4e3e87"} Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.594352 4842 generic.go:334] "Generic (PLEG): container finished" podID="4aec138d-5919-431c-9b1a-ba46ae379e7f" 
containerID="103594b46390346cb727c5f4fde4dc80b8b1836ab75e3bd90dbcf92605ca96ee" exitCode=0 Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.594445 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4aec138d-5919-431c-9b1a-ba46ae379e7f","Type":"ContainerDied","Data":"103594b46390346cb727c5f4fde4dc80b8b1836ab75e3bd90dbcf92605ca96ee"} Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.621056 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podStartSLOduration=2.621033604 podStartE2EDuration="2.621033604s" podCreationTimestamp="2026-03-11 19:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:27.591657987 +0000 UTC m=+1633.239354277" watchObservedRunningTime="2026-03-11 19:16:27.621033604 +0000 UTC m=+1633.268729884" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.625536 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podStartSLOduration=2.625527424 podStartE2EDuration="2.625527424s" podCreationTimestamp="2026-03-11 19:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:27.60896315 +0000 UTC m=+1633.256659440" watchObservedRunningTime="2026-03-11 19:16:27.625527424 +0000 UTC m=+1633.273223704" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.651917 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.825843 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.860854 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aec138d-5919-431c-9b1a-ba46ae379e7f-logs\") pod \"4aec138d-5919-431c-9b1a-ba46ae379e7f\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.860974 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aec138d-5919-431c-9b1a-ba46ae379e7f-config-data\") pod \"4aec138d-5919-431c-9b1a-ba46ae379e7f\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.861009 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfmx7\" (UniqueName: \"kubernetes.io/projected/4aec138d-5919-431c-9b1a-ba46ae379e7f-kube-api-access-rfmx7\") pod \"4aec138d-5919-431c-9b1a-ba46ae379e7f\" (UID: \"4aec138d-5919-431c-9b1a-ba46ae379e7f\") " Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.862333 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aec138d-5919-431c-9b1a-ba46ae379e7f-logs" (OuterVolumeSpecName: "logs") pod "4aec138d-5919-431c-9b1a-ba46ae379e7f" (UID: "4aec138d-5919-431c-9b1a-ba46ae379e7f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.865454 4842 scope.go:117] "RemoveContainer" containerID="574593bc2b814737bcb08e25483588c41dae41b2f19d7eb546096a6a91a6f760" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.866243 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aec138d-5919-431c-9b1a-ba46ae379e7f-kube-api-access-rfmx7" (OuterVolumeSpecName: "kube-api-access-rfmx7") pod "4aec138d-5919-431c-9b1a-ba46ae379e7f" (UID: "4aec138d-5919-431c-9b1a-ba46ae379e7f"). InnerVolumeSpecName "kube-api-access-rfmx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.892809 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.893883 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aec138d-5919-431c-9b1a-ba46ae379e7f-config-data" (OuterVolumeSpecName: "config-data") pod "4aec138d-5919-431c-9b1a-ba46ae379e7f" (UID: "4aec138d-5919-431c-9b1a-ba46ae379e7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.927585 4842 scope.go:117] "RemoveContainer" containerID="bb244240fb85d311334659590232a25399b599cb31f8c9d93233e2e094d0b8f6" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.962438 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4aec138d-5919-431c-9b1a-ba46ae379e7f-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.962485 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4aec138d-5919-431c-9b1a-ba46ae379e7f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:27 crc kubenswrapper[4842]: I0311 19:16:27.962499 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfmx7\" (UniqueName: \"kubernetes.io/projected/4aec138d-5919-431c-9b1a-ba46ae379e7f-kube-api-access-rfmx7\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.064712 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af01486d-9b65-4e91-8ec4-107362d4988e-logs\") pod \"af01486d-9b65-4e91-8ec4-107362d4988e\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.064772 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af01486d-9b65-4e91-8ec4-107362d4988e-config-data\") pod \"af01486d-9b65-4e91-8ec4-107362d4988e\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.064813 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n9gl\" (UniqueName: \"kubernetes.io/projected/af01486d-9b65-4e91-8ec4-107362d4988e-kube-api-access-8n9gl\") pod 
\"af01486d-9b65-4e91-8ec4-107362d4988e\" (UID: \"af01486d-9b65-4e91-8ec4-107362d4988e\") " Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.065468 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af01486d-9b65-4e91-8ec4-107362d4988e-logs" (OuterVolumeSpecName: "logs") pod "af01486d-9b65-4e91-8ec4-107362d4988e" (UID: "af01486d-9b65-4e91-8ec4-107362d4988e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.067798 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af01486d-9b65-4e91-8ec4-107362d4988e-kube-api-access-8n9gl" (OuterVolumeSpecName: "kube-api-access-8n9gl") pod "af01486d-9b65-4e91-8ec4-107362d4988e" (UID: "af01486d-9b65-4e91-8ec4-107362d4988e"). InnerVolumeSpecName "kube-api-access-8n9gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.091626 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af01486d-9b65-4e91-8ec4-107362d4988e-config-data" (OuterVolumeSpecName: "config-data") pod "af01486d-9b65-4e91-8ec4-107362d4988e" (UID: "af01486d-9b65-4e91-8ec4-107362d4988e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.166740 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af01486d-9b65-4e91-8ec4-107362d4988e-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.166789 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n9gl\" (UniqueName: \"kubernetes.io/projected/af01486d-9b65-4e91-8ec4-107362d4988e-kube-api-access-8n9gl\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.166798 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af01486d-9b65-4e91-8ec4-107362d4988e-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.621652 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"af01486d-9b65-4e91-8ec4-107362d4988e","Type":"ContainerDied","Data":"b91c4bd7d451e9ef983a5b06bff98df83ff6df938c82c5c3a51a8c7c79c21826"} Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.621733 4842 scope.go:117] "RemoveContainer" containerID="5f1ce1c51d846d279e12fc5511255acb62157ec7567c27130cb550e77f4e3e87" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.621907 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.628792 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4aec138d-5919-431c-9b1a-ba46ae379e7f","Type":"ContainerDied","Data":"43d9dd0923729eca74f57d308a8af57c61ba8b4225e1340a854443aba852711a"} Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.628901 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.632815 4842 generic.go:334] "Generic (PLEG): container finished" podID="faab7cbc-3c65-4c49-bce1-14b8a0b091b1" containerID="36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9" exitCode=0 Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.632879 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"faab7cbc-3c65-4c49-bce1-14b8a0b091b1","Type":"ContainerDied","Data":"36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9"} Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.636721 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"393e9a7f-b46d-4bff-808a-68dcb6455015","Type":"ContainerStarted","Data":"d5bf1131869db5f0928ff57f0e44c264360d6a36637ee4fa656e99ba4ad57625"} Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.636760 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"393e9a7f-b46d-4bff-808a-68dcb6455015","Type":"ContainerStarted","Data":"39c8dfb53231054ab72c42d7b4667abc3f2d40479941e041b79f78a9a2932225"} Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.662832 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.662803858 podStartE2EDuration="2.662803858s" podCreationTimestamp="2026-03-11 19:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:28.653650253 +0000 UTC m=+1634.301346573" watchObservedRunningTime="2026-03-11 19:16:28.662803858 +0000 UTC m=+1634.310500148" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.676532 4842 scope.go:117] "RemoveContainer" 
containerID="4bbac15eaca177eb9799f1a5af6ea0a73513cfefb62f4b1ae7c93b32b7fa7a4e" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.712392 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.731836 4842 scope.go:117] "RemoveContainer" containerID="103594b46390346cb727c5f4fde4dc80b8b1836ab75e3bd90dbcf92605ca96ee" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.753350 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.769524 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: E0311 19:16:28.775920 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-api" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.775967 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-api" Mar 11 19:16:28 crc kubenswrapper[4842]: E0311 19:16:28.775988 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-log" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.776026 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-log" Mar 11 19:16:28 crc kubenswrapper[4842]: E0311 19:16:28.776049 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-log" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.776057 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-log" Mar 11 19:16:28 crc kubenswrapper[4842]: 
E0311 19:16:28.776072 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.776080 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.776325 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-api" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.776357 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-log" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.776367 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" containerName="nova-kuttl-api-log" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.776386 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.778913 4842 scope.go:117] "RemoveContainer" containerID="ba7bcd3f4ed05a5a21376f0ece1119ce6b43548dbb0d361f0c2495fd30805e06" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.780786 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.781048 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.783593 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.794363 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.811730 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.820996 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.823470 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.826066 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.845090 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.915690 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.979529 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aec138d-5919-431c-9b1a-ba46ae379e7f" path="/var/lib/kubelet/pods/4aec138d-5919-431c-9b1a-ba46ae379e7f/volumes" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.980391 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af01486d-9b65-4e91-8ec4-107362d4988e" path="/var/lib/kubelet/pods/af01486d-9b65-4e91-8ec4-107362d4988e/volumes" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.984073 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f22161-086a-468c-a2b4-7d12e64ef4e3-logs\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.984138 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec95bd7-2128-4e76-a3a0-50483c052983-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.984206 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6tft\" (UniqueName: \"kubernetes.io/projected/91f22161-086a-468c-a2b4-7d12e64ef4e3-kube-api-access-m6tft\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.984240 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ec95bd7-2128-4e76-a3a0-50483c052983-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.984287 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f22161-086a-468c-a2b4-7d12e64ef4e3-config-data\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:28 crc kubenswrapper[4842]: I0311 19:16:28.984319 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpkwc\" (UniqueName: \"kubernetes.io/projected/5ec95bd7-2128-4e76-a3a0-50483c052983-kube-api-access-qpkwc\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.085441 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr88z\" (UniqueName: \"kubernetes.io/projected/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-kube-api-access-dr88z\") pod \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.085591 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-config-data\") pod \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\" (UID: \"faab7cbc-3c65-4c49-bce1-14b8a0b091b1\") " Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.085862 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f22161-086a-468c-a2b4-7d12e64ef4e3-logs\") pod \"nova-kuttl-api-0\" (UID: 
\"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.085898 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec95bd7-2128-4e76-a3a0-50483c052983-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.085953 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6tft\" (UniqueName: \"kubernetes.io/projected/91f22161-086a-468c-a2b4-7d12e64ef4e3-kube-api-access-m6tft\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.085981 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec95bd7-2128-4e76-a3a0-50483c052983-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.086001 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f22161-086a-468c-a2b4-7d12e64ef4e3-config-data\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.086036 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkwc\" (UniqueName: \"kubernetes.io/projected/5ec95bd7-2128-4e76-a3a0-50483c052983-kube-api-access-qpkwc\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc 
kubenswrapper[4842]: I0311 19:16:29.086561 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f22161-086a-468c-a2b4-7d12e64ef4e3-logs\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.086801 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec95bd7-2128-4e76-a3a0-50483c052983-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.094964 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec95bd7-2128-4e76-a3a0-50483c052983-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.094964 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f22161-086a-468c-a2b4-7d12e64ef4e3-config-data\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.095095 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-kube-api-access-dr88z" (OuterVolumeSpecName: "kube-api-access-dr88z") pod "faab7cbc-3c65-4c49-bce1-14b8a0b091b1" (UID: "faab7cbc-3c65-4c49-bce1-14b8a0b091b1"). InnerVolumeSpecName "kube-api-access-dr88z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.110223 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6tft\" (UniqueName: \"kubernetes.io/projected/91f22161-086a-468c-a2b4-7d12e64ef4e3-kube-api-access-m6tft\") pod \"nova-kuttl-api-0\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.110834 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.120143 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpkwc\" (UniqueName: \"kubernetes.io/projected/5ec95bd7-2128-4e76-a3a0-50483c052983-kube-api-access-qpkwc\") pod \"nova-kuttl-metadata-0\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.120426 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-config-data" (OuterVolumeSpecName: "config-data") pod "faab7cbc-3c65-4c49-bce1-14b8a0b091b1" (UID: "faab7cbc-3c65-4c49-bce1-14b8a0b091b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.143857 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.187844 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.187893 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr88z\" (UniqueName: \"kubernetes.io/projected/faab7cbc-3c65-4c49-bce1-14b8a0b091b1-kube-api-access-dr88z\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.634666 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:29 crc kubenswrapper[4842]: W0311 19:16:29.640776 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91f22161_086a_468c_a2b4_7d12e64ef4e3.slice/crio-9f40344149dbffd3623f457c28fa45f3fe0fca4cceca56bd119e75cf04fe6ed1 WatchSource:0}: Error finding container 9f40344149dbffd3623f457c28fa45f3fe0fca4cceca56bd119e75cf04fe6ed1: Status 404 returned error can't find the container with id 9f40344149dbffd3623f457c28fa45f3fe0fca4cceca56bd119e75cf04fe6ed1 Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.652697 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"faab7cbc-3c65-4c49-bce1-14b8a0b091b1","Type":"ContainerDied","Data":"1b0b8423f087240664d6a5fb195bfc992b6465baf1909719d7c6f146a2380ddf"} Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.652749 4842 scope.go:117] "RemoveContainer" containerID="36d63eb011c826b0f23a5c2b83028a306d9a1282d81b15ac96a70d3a0fdc84c9" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.652871 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.710818 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.727368 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.754628 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:29 crc kubenswrapper[4842]: E0311 19:16:29.754960 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faab7cbc-3c65-4c49-bce1-14b8a0b091b1" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.754971 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="faab7cbc-3c65-4c49-bce1-14b8a0b091b1" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.755133 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="faab7cbc-3c65-4c49-bce1-14b8a0b091b1" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.755668 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.759391 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.766047 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.775966 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.801259 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66604064-161f-428b-8162-424fde211ecd-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.801439 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7zdw\" (UniqueName: \"kubernetes.io/projected/66604064-161f-428b-8162-424fde211ecd-kube-api-access-m7zdw\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.902240 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7zdw\" (UniqueName: \"kubernetes.io/projected/66604064-161f-428b-8162-424fde211ecd-kube-api-access-m7zdw\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.902394 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66604064-161f-428b-8162-424fde211ecd-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.905450 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66604064-161f-428b-8162-424fde211ecd-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:29 crc kubenswrapper[4842]: I0311 19:16:29.923090 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7zdw\" (UniqueName: \"kubernetes.io/projected/66604064-161f-428b-8162-424fde211ecd-kube-api-access-m7zdw\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.092601 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.554395 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.665873 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"66604064-161f-428b-8162-424fde211ecd","Type":"ContainerStarted","Data":"a6f94ade9e28161bf96a9727a66aa80d9871a21fce8761482cb6974804e13080"} Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.668200 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5ec95bd7-2128-4e76-a3a0-50483c052983","Type":"ContainerStarted","Data":"9e65bb4f838186eb2293c13fffd95f6a600fae53cd54f7c69f88db171e896a01"} Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.668302 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5ec95bd7-2128-4e76-a3a0-50483c052983","Type":"ContainerStarted","Data":"23fc8ea7c394b6b42343b3883419ea43615214232fc614c18a62f31f0591ff2e"} Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.668322 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5ec95bd7-2128-4e76-a3a0-50483c052983","Type":"ContainerStarted","Data":"4455aab45f997506e9619358b406ededae06b860bfdc4003ad37bc68b24c5fc8"} Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.670339 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"91f22161-086a-468c-a2b4-7d12e64ef4e3","Type":"ContainerStarted","Data":"12964dd04b1315d056149c9b3c25c4747331fb868618e027fd3ea0f3e9f92762"} Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.670400 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" 
event={"ID":"91f22161-086a-468c-a2b4-7d12e64ef4e3","Type":"ContainerStarted","Data":"b105eaf6a4ac02f587d18c37345fc7acf21e3a68bbe7b061a3901e09a53cec80"} Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.670422 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"91f22161-086a-468c-a2b4-7d12e64ef4e3","Type":"ContainerStarted","Data":"9f40344149dbffd3623f457c28fa45f3fe0fca4cceca56bd119e75cf04fe6ed1"} Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.692370 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.69235147 podStartE2EDuration="2.69235147s" podCreationTimestamp="2026-03-11 19:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:30.686863613 +0000 UTC m=+1636.334559903" watchObservedRunningTime="2026-03-11 19:16:30.69235147 +0000 UTC m=+1636.340047760" Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.716068 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.716029594 podStartE2EDuration="2.716029594s" podCreationTimestamp="2026-03-11 19:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:30.710783474 +0000 UTC m=+1636.358479824" watchObservedRunningTime="2026-03-11 19:16:30.716029594 +0000 UTC m=+1636.363725864" Mar 11 19:16:30 crc kubenswrapper[4842]: I0311 19:16:30.977524 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faab7cbc-3c65-4c49-bce1-14b8a0b091b1" path="/var/lib/kubelet/pods/faab7cbc-3c65-4c49-bce1-14b8a0b091b1/volumes" Mar 11 19:16:31 crc kubenswrapper[4842]: I0311 19:16:31.037552 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:31 crc kubenswrapper[4842]: I0311 19:16:31.057517 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:31 crc kubenswrapper[4842]: I0311 19:16:31.681546 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"66604064-161f-428b-8162-424fde211ecd","Type":"ContainerStarted","Data":"941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c"} Mar 11 19:16:31 crc kubenswrapper[4842]: I0311 19:16:31.681936 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:31 crc kubenswrapper[4842]: I0311 19:16:31.705169 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=2.705140248 podStartE2EDuration="2.705140248s" podCreationTimestamp="2026-03-11 19:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:31.695913931 +0000 UTC m=+1637.343610251" watchObservedRunningTime="2026-03-11 19:16:31.705140248 +0000 UTC m=+1637.352836578" Mar 11 19:16:32 crc kubenswrapper[4842]: I0311 19:16:32.107112 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.125690 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.869939 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n"] Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.875885 4842 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"] Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.893603 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-nt26x"] Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.911833 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-8fd8n"] Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.962796 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:16:35 crc kubenswrapper[4842]: E0311 19:16:35.963019 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.993190 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.995208 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-log" containerID="cri-o://b105eaf6a4ac02f587d18c37345fc7acf21e3a68bbe7b061a3901e09a53cec80" gracePeriod=30 Mar 11 19:16:35 crc kubenswrapper[4842]: I0311 19:16:35.995383 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-api" containerID="cri-o://12964dd04b1315d056149c9b3c25c4747331fb868618e027fd3ea0f3e9f92762" gracePeriod=30 Mar 
11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.017812 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell1f473-account-delete-nxwvf"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.018795 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.038613 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.039118 4842 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" secret="" err="secret \"nova-nova-kuttl-dockercfg-zdjxx\" not found" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.058506 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1f473-account-delete-nxwvf"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.083720 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.086100 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.086361 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerName="nova-kuttl-metadata-log" containerID="cri-o://23fc8ea7c394b6b42343b3883419ea43615214232fc614c18a62f31f0591ff2e" gracePeriod=30 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.086822 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" 
containerName="nova-kuttl-metadata-metadata" containerID="cri-o://9e65bb4f838186eb2293c13fffd95f6a600fae53cd54f7c69f88db171e896a01" gracePeriod=30 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.099807 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.104696 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novaapi9f5a-account-delete-bxhzn"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.106033 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.131363 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapi9f5a-account-delete-bxhzn"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.133440 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccc7z\" (UniqueName: \"kubernetes.io/projected/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-kube-api-access-ccc7z\") pod \"novacell1f473-account-delete-nxwvf\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.133485 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xjr\" (UniqueName: \"kubernetes.io/projected/8651b954-9fb9-4883-9018-21b8830a1254-kube-api-access-j2xjr\") pod \"novaapi9f5a-account-delete-bxhzn\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.133554 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8651b954-9fb9-4883-9018-21b8830a1254-operator-scripts\") pod \"novaapi9f5a-account-delete-bxhzn\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.133614 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-operator-scripts\") pod \"novacell1f473-account-delete-nxwvf\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: E0311 19:16:36.133747 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-novncproxy-config-data: secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:36 crc kubenswrapper[4842]: E0311 19:16:36.133798 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data podName:70231d57-b20e-4eac-aa2c-29d1a7247ee6 nodeName:}" failed. No retries permitted until 2026-03-11 19:16:36.633778385 +0000 UTC m=+1642.281474665 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data") pod "nova-kuttl-cell1-novncproxy-0" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6") : secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.157441 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.157702 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="393e9a7f-b46d-4bff-808a-68dcb6455015" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://d5bf1131869db5f0928ff57f0e44c264360d6a36637ee4fa656e99ba4ad57625" gracePeriod=30 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.169445 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell0f700-account-delete-j8665"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.170543 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.175304 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell0f700-account-delete-j8665"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.237223 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccc7z\" (UniqueName: \"kubernetes.io/projected/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-kube-api-access-ccc7z\") pod \"novacell1f473-account-delete-nxwvf\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.237296 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xjr\" (UniqueName: \"kubernetes.io/projected/8651b954-9fb9-4883-9018-21b8830a1254-kube-api-access-j2xjr\") pod \"novaapi9f5a-account-delete-bxhzn\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.237381 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8651b954-9fb9-4883-9018-21b8830a1254-operator-scripts\") pod \"novaapi9f5a-account-delete-bxhzn\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.237443 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-operator-scripts\") pod \"novacell1f473-account-delete-nxwvf\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.238432 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-operator-scripts\") pod \"novacell1f473-account-delete-nxwvf\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.238777 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8651b954-9fb9-4883-9018-21b8830a1254-operator-scripts\") pod \"novaapi9f5a-account-delete-bxhzn\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.268931 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccc7z\" (UniqueName: \"kubernetes.io/projected/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-kube-api-access-ccc7z\") pod \"novacell1f473-account-delete-nxwvf\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.279028 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xjr\" (UniqueName: \"kubernetes.io/projected/8651b954-9fb9-4883-9018-21b8830a1254-kube-api-access-j2xjr\") pod \"novaapi9f5a-account-delete-bxhzn\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.339111 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v4f8\" (UniqueName: \"kubernetes.io/projected/8ccf3aa4-40b4-49e1-a842-8daad633be37-kube-api-access-8v4f8\") pod \"novacell0f700-account-delete-j8665\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " 
pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.339178 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf3aa4-40b4-49e1-a842-8daad633be37-operator-scripts\") pod \"novacell0f700-account-delete-j8665\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.374264 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.440723 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf3aa4-40b4-49e1-a842-8daad633be37-operator-scripts\") pod \"novacell0f700-account-delete-j8665\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.440966 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v4f8\" (UniqueName: \"kubernetes.io/projected/8ccf3aa4-40b4-49e1-a842-8daad633be37-kube-api-access-8v4f8\") pod \"novacell0f700-account-delete-j8665\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.441799 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf3aa4-40b4-49e1-a842-8daad633be37-operator-scripts\") pod \"novacell0f700-account-delete-j8665\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.455558 
4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v4f8\" (UniqueName: \"kubernetes.io/projected/8ccf3aa4-40b4-49e1-a842-8daad633be37-kube-api-access-8v4f8\") pod \"novacell0f700-account-delete-j8665\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.476990 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.493232 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:36 crc kubenswrapper[4842]: E0311 19:16:36.644032 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-novncproxy-config-data: secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:36 crc kubenswrapper[4842]: E0311 19:16:36.644095 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data podName:70231d57-b20e-4eac-aa2c-29d1a7247ee6 nodeName:}" failed. No retries permitted until 2026-03-11 19:16:37.644078808 +0000 UTC m=+1643.291775088 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data") pod "nova-kuttl-cell1-novncproxy-0" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6") : secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.721175 4842 generic.go:334] "Generic (PLEG): container finished" podID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerID="b105eaf6a4ac02f587d18c37345fc7acf21e3a68bbe7b061a3901e09a53cec80" exitCode=143 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.721209 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"91f22161-086a-468c-a2b4-7d12e64ef4e3","Type":"ContainerDied","Data":"b105eaf6a4ac02f587d18c37345fc7acf21e3a68bbe7b061a3901e09a53cec80"} Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.723138 4842 generic.go:334] "Generic (PLEG): container finished" podID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerID="9e65bb4f838186eb2293c13fffd95f6a600fae53cd54f7c69f88db171e896a01" exitCode=0 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.723165 4842 generic.go:334] "Generic (PLEG): container finished" podID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerID="23fc8ea7c394b6b42343b3883419ea43615214232fc614c18a62f31f0591ff2e" exitCode=143 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.723308 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="70231d57-b20e-4eac-aa2c-29d1a7247ee6" containerName="nova-kuttl-cell1-novncproxy-novncproxy" containerID="cri-o://77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce" gracePeriod=30 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.723370 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" 
event={"ID":"5ec95bd7-2128-4e76-a3a0-50483c052983","Type":"ContainerDied","Data":"9e65bb4f838186eb2293c13fffd95f6a600fae53cd54f7c69f88db171e896a01"} Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.723402 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5ec95bd7-2128-4e76-a3a0-50483c052983","Type":"ContainerDied","Data":"23fc8ea7c394b6b42343b3883419ea43615214232fc614c18a62f31f0591ff2e"} Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.734242 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.734661 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.848175 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpkwc\" (UniqueName: \"kubernetes.io/projected/5ec95bd7-2128-4e76-a3a0-50483c052983-kube-api-access-qpkwc\") pod \"5ec95bd7-2128-4e76-a3a0-50483c052983\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.848246 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec95bd7-2128-4e76-a3a0-50483c052983-logs\") pod \"5ec95bd7-2128-4e76-a3a0-50483c052983\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.848311 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec95bd7-2128-4e76-a3a0-50483c052983-config-data\") pod \"5ec95bd7-2128-4e76-a3a0-50483c052983\" (UID: \"5ec95bd7-2128-4e76-a3a0-50483c052983\") " Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.848784 4842 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec95bd7-2128-4e76-a3a0-50483c052983-logs" (OuterVolumeSpecName: "logs") pod "5ec95bd7-2128-4e76-a3a0-50483c052983" (UID: "5ec95bd7-2128-4e76-a3a0-50483c052983"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.852522 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec95bd7-2128-4e76-a3a0-50483c052983-kube-api-access-qpkwc" (OuterVolumeSpecName: "kube-api-access-qpkwc") pod "5ec95bd7-2128-4e76-a3a0-50483c052983" (UID: "5ec95bd7-2128-4e76-a3a0-50483c052983"). InnerVolumeSpecName "kube-api-access-qpkwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.880839 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec95bd7-2128-4e76-a3a0-50483c052983-config-data" (OuterVolumeSpecName: "config-data") pod "5ec95bd7-2128-4e76-a3a0-50483c052983" (UID: "5ec95bd7-2128-4e76-a3a0-50483c052983"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.917667 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapi9f5a-account-delete-bxhzn"] Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.923852 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1f473-account-delete-nxwvf"] Mar 11 19:16:36 crc kubenswrapper[4842]: W0311 19:16:36.925338 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8651b954_9fb9_4883_9018_21b8830a1254.slice/crio-688ac1970958d660e3a458422acef72a5feb8a7de78ebb606af163a7429b6d81 WatchSource:0}: Error finding container 688ac1970958d660e3a458422acef72a5feb8a7de78ebb606af163a7429b6d81: Status 404 returned error can't find the container with id 688ac1970958d660e3a458422acef72a5feb8a7de78ebb606af163a7429b6d81 Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.950753 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec95bd7-2128-4e76-a3a0-50483c052983-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.950781 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec95bd7-2128-4e76-a3a0-50483c052983-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.950817 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpkwc\" (UniqueName: \"kubernetes.io/projected/5ec95bd7-2128-4e76-a3a0-50483c052983-kube-api-access-qpkwc\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.983780 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a127e0-ea5b-46e1-92a5-d748c5415c18" path="/var/lib/kubelet/pods/b8a127e0-ea5b-46e1-92a5-d748c5415c18/volumes" Mar 11 
19:16:36 crc kubenswrapper[4842]: I0311 19:16:36.984365 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd150eed-cae1-4a99-a90f-d533d05070bf" path="/var/lib/kubelet/pods/cd150eed-cae1-4a99-a90f-d533d05070bf/volumes" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.113684 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell0f700-account-delete-j8665"] Mar 11 19:16:37 crc kubenswrapper[4842]: E0311 19:16:37.685252 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-novncproxy-config-data: secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:37 crc kubenswrapper[4842]: E0311 19:16:37.685589 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data podName:70231d57-b20e-4eac-aa2c-29d1a7247ee6 nodeName:}" failed. No retries permitted until 2026-03-11 19:16:39.685569974 +0000 UTC m=+1645.333266254 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data") pod "nova-kuttl-cell1-novncproxy-0" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6") : secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.738906 4842 generic.go:334] "Generic (PLEG): container finished" podID="8651b954-9fb9-4883-9018-21b8830a1254" containerID="fb21b506ca3956bb6ad0d1b5272ad2bf7fd9b0e9c4b81b82340db9b3a1e3da95" exitCode=0 Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.738978 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" event={"ID":"8651b954-9fb9-4883-9018-21b8830a1254","Type":"ContainerDied","Data":"fb21b506ca3956bb6ad0d1b5272ad2bf7fd9b0e9c4b81b82340db9b3a1e3da95"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.739017 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" event={"ID":"8651b954-9fb9-4883-9018-21b8830a1254","Type":"ContainerStarted","Data":"688ac1970958d660e3a458422acef72a5feb8a7de78ebb606af163a7429b6d81"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.740224 4842 generic.go:334] "Generic (PLEG): container finished" podID="8ccf3aa4-40b4-49e1-a842-8daad633be37" containerID="10a78b515298434e21790e36bedfcd2448cb9b6076f2a7cbdd850500bd417a84" exitCode=0 Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.740261 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0f700-account-delete-j8665" event={"ID":"8ccf3aa4-40b4-49e1-a842-8daad633be37","Type":"ContainerDied","Data":"10a78b515298434e21790e36bedfcd2448cb9b6076f2a7cbdd850500bd417a84"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.740337 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0f700-account-delete-j8665" 
event={"ID":"8ccf3aa4-40b4-49e1-a842-8daad633be37","Type":"ContainerStarted","Data":"70cd94fbf682ff013b670ec221e58c7013c95aa766a8df884ec6e79251fac486"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.743740 4842 generic.go:334] "Generic (PLEG): container finished" podID="6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae" containerID="39665699be4ef9ff883dcc65b468ba9074af72d2a74d759ea8562d58654fd331" exitCode=0 Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.743830 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" event={"ID":"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae","Type":"ContainerDied","Data":"39665699be4ef9ff883dcc65b468ba9074af72d2a74d759ea8562d58654fd331"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.743868 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" event={"ID":"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae","Type":"ContainerStarted","Data":"63285ea66c106d88d8ef3c5933ce646a48f813b285291209bfeffaea7e29426e"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.746070 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"5ec95bd7-2128-4e76-a3a0-50483c052983","Type":"ContainerDied","Data":"4455aab45f997506e9619358b406ededae06b860bfdc4003ad37bc68b24c5fc8"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.746119 4842 scope.go:117] "RemoveContainer" containerID="9e65bb4f838186eb2293c13fffd95f6a600fae53cd54f7c69f88db171e896a01" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.746314 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.756647 4842 generic.go:334] "Generic (PLEG): container finished" podID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerID="12964dd04b1315d056149c9b3c25c4747331fb868618e027fd3ea0f3e9f92762" exitCode=0 Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.756698 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"91f22161-086a-468c-a2b4-7d12e64ef4e3","Type":"ContainerDied","Data":"12964dd04b1315d056149c9b3c25c4747331fb868618e027fd3ea0f3e9f92762"} Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.777585 4842 scope.go:117] "RemoveContainer" containerID="23fc8ea7c394b6b42343b3883419ea43615214232fc614c18a62f31f0591ff2e" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.820726 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.837754 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.846507 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.890664 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f22161-086a-468c-a2b4-7d12e64ef4e3-logs\") pod \"91f22161-086a-468c-a2b4-7d12e64ef4e3\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.890729 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6tft\" (UniqueName: \"kubernetes.io/projected/91f22161-086a-468c-a2b4-7d12e64ef4e3-kube-api-access-m6tft\") pod \"91f22161-086a-468c-a2b4-7d12e64ef4e3\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.890750 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f22161-086a-468c-a2b4-7d12e64ef4e3-config-data\") pod \"91f22161-086a-468c-a2b4-7d12e64ef4e3\" (UID: \"91f22161-086a-468c-a2b4-7d12e64ef4e3\") " Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.891706 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91f22161-086a-468c-a2b4-7d12e64ef4e3-logs" (OuterVolumeSpecName: "logs") pod "91f22161-086a-468c-a2b4-7d12e64ef4e3" (UID: "91f22161-086a-468c-a2b4-7d12e64ef4e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.896137 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f22161-086a-468c-a2b4-7d12e64ef4e3-kube-api-access-m6tft" (OuterVolumeSpecName: "kube-api-access-m6tft") pod "91f22161-086a-468c-a2b4-7d12e64ef4e3" (UID: "91f22161-086a-468c-a2b4-7d12e64ef4e3"). InnerVolumeSpecName "kube-api-access-m6tft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.911420 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f22161-086a-468c-a2b4-7d12e64ef4e3-config-data" (OuterVolumeSpecName: "config-data") pod "91f22161-086a-468c-a2b4-7d12e64ef4e3" (UID: "91f22161-086a-468c-a2b4-7d12e64ef4e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.992661 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91f22161-086a-468c-a2b4-7d12e64ef4e3-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.992689 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6tft\" (UniqueName: \"kubernetes.io/projected/91f22161-086a-468c-a2b4-7d12e64ef4e3-kube-api-access-m6tft\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:37 crc kubenswrapper[4842]: I0311 19:16:37.992705 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91f22161-086a-468c-a2b4-7d12e64ef4e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.767393 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"91f22161-086a-468c-a2b4-7d12e64ef4e3","Type":"ContainerDied","Data":"9f40344149dbffd3623f457c28fa45f3fe0fca4cceca56bd119e75cf04fe6ed1"} Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.767726 4842 scope.go:117] "RemoveContainer" containerID="12964dd04b1315d056149c9b3c25c4747331fb868618e027fd3ea0f3e9f92762" Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.767416 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.789872 4842 scope.go:117] "RemoveContainer" containerID="b105eaf6a4ac02f587d18c37345fc7acf21e3a68bbe7b061a3901e09a53cec80" Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.804899 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.818377 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.972755 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" path="/var/lib/kubelet/pods/5ec95bd7-2128-4e76-a3a0-50483c052983/volumes" Mar 11 19:16:38 crc kubenswrapper[4842]: I0311 19:16:38.973786 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" path="/var/lib/kubelet/pods/91f22161-086a-468c-a2b4-7d12e64ef4e3/volumes" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.209142 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.311625 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.318988 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.322879 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf3aa4-40b4-49e1-a842-8daad633be37-operator-scripts\") pod \"8ccf3aa4-40b4-49e1-a842-8daad633be37\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.322922 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v4f8\" (UniqueName: \"kubernetes.io/projected/8ccf3aa4-40b4-49e1-a842-8daad633be37-kube-api-access-8v4f8\") pod \"8ccf3aa4-40b4-49e1-a842-8daad633be37\" (UID: \"8ccf3aa4-40b4-49e1-a842-8daad633be37\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.324057 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ccf3aa4-40b4-49e1-a842-8daad633be37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ccf3aa4-40b4-49e1-a842-8daad633be37" (UID: "8ccf3aa4-40b4-49e1-a842-8daad633be37"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.328224 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ccf3aa4-40b4-49e1-a842-8daad633be37-kube-api-access-8v4f8" (OuterVolumeSpecName: "kube-api-access-8v4f8") pod "8ccf3aa4-40b4-49e1-a842-8daad633be37" (UID: "8ccf3aa4-40b4-49e1-a842-8daad633be37"). InnerVolumeSpecName "kube-api-access-8v4f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424166 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccc7z\" (UniqueName: \"kubernetes.io/projected/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-kube-api-access-ccc7z\") pod \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424244 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-operator-scripts\") pod \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\" (UID: \"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424333 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2xjr\" (UniqueName: \"kubernetes.io/projected/8651b954-9fb9-4883-9018-21b8830a1254-kube-api-access-j2xjr\") pod \"8651b954-9fb9-4883-9018-21b8830a1254\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424393 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8651b954-9fb9-4883-9018-21b8830a1254-operator-scripts\") pod \"8651b954-9fb9-4883-9018-21b8830a1254\" (UID: \"8651b954-9fb9-4883-9018-21b8830a1254\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424724 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae" (UID: "6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424850 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v4f8\" (UniqueName: \"kubernetes.io/projected/8ccf3aa4-40b4-49e1-a842-8daad633be37-kube-api-access-8v4f8\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424848 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8651b954-9fb9-4883-9018-21b8830a1254-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8651b954-9fb9-4883-9018-21b8830a1254" (UID: "8651b954-9fb9-4883-9018-21b8830a1254"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424869 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.424926 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ccf3aa4-40b4-49e1-a842-8daad633be37-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.427206 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-kube-api-access-ccc7z" (OuterVolumeSpecName: "kube-api-access-ccc7z") pod "6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae" (UID: "6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae"). InnerVolumeSpecName "kube-api-access-ccc7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.427321 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8651b954-9fb9-4883-9018-21b8830a1254-kube-api-access-j2xjr" (OuterVolumeSpecName: "kube-api-access-j2xjr") pod "8651b954-9fb9-4883-9018-21b8830a1254" (UID: "8651b954-9fb9-4883-9018-21b8830a1254"). InnerVolumeSpecName "kube-api-access-j2xjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.525934 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8651b954-9fb9-4883-9018-21b8830a1254-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.525963 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccc7z\" (UniqueName: \"kubernetes.io/projected/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae-kube-api-access-ccc7z\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.525973 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2xjr\" (UniqueName: \"kubernetes.io/projected/8651b954-9fb9-4883-9018-21b8830a1254-kube-api-access-j2xjr\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:39 crc kubenswrapper[4842]: E0311 19:16:39.728634 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-novncproxy-config-data: secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:39 crc kubenswrapper[4842]: E0311 19:16:39.728720 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data podName:70231d57-b20e-4eac-aa2c-29d1a7247ee6 nodeName:}" failed. No retries permitted until 2026-03-11 19:16:43.728702421 +0000 UTC m=+1649.376398701 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data") pod "nova-kuttl-cell1-novncproxy-0" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6") : secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.780815 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0f700-account-delete-j8665" event={"ID":"8ccf3aa4-40b4-49e1-a842-8daad633be37","Type":"ContainerDied","Data":"70cd94fbf682ff013b670ec221e58c7013c95aa766a8df884ec6e79251fac486"} Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.781119 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70cd94fbf682ff013b670ec221e58c7013c95aa766a8df884ec6e79251fac486" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.780904 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0f700-account-delete-j8665" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.781238 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.783832 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.783838 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1f473-account-delete-nxwvf" event={"ID":"6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae","Type":"ContainerDied","Data":"63285ea66c106d88d8ef3c5933ce646a48f813b285291209bfeffaea7e29426e"} Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.783872 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63285ea66c106d88d8ef3c5933ce646a48f813b285291209bfeffaea7e29426e" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.786366 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.786365 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapi9f5a-account-delete-bxhzn" event={"ID":"8651b954-9fb9-4883-9018-21b8830a1254","Type":"ContainerDied","Data":"688ac1970958d660e3a458422acef72a5feb8a7de78ebb606af163a7429b6d81"} Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.786391 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="688ac1970958d660e3a458422acef72a5feb8a7de78ebb606af163a7429b6d81" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.788042 4842 generic.go:334] "Generic (PLEG): container finished" podID="393e9a7f-b46d-4bff-808a-68dcb6455015" containerID="d5bf1131869db5f0928ff57f0e44c264360d6a36637ee4fa656e99ba4ad57625" exitCode=0 Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.788065 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.788074 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"393e9a7f-b46d-4bff-808a-68dcb6455015","Type":"ContainerDied","Data":"d5bf1131869db5f0928ff57f0e44c264360d6a36637ee4fa656e99ba4ad57625"} Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.788139 4842 scope.go:117] "RemoveContainer" containerID="d5bf1131869db5f0928ff57f0e44c264360d6a36637ee4fa656e99ba4ad57625" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.831663 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2m8g\" (UniqueName: \"kubernetes.io/projected/393e9a7f-b46d-4bff-808a-68dcb6455015-kube-api-access-k2m8g\") pod \"393e9a7f-b46d-4bff-808a-68dcb6455015\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.831922 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393e9a7f-b46d-4bff-808a-68dcb6455015-config-data\") pod \"393e9a7f-b46d-4bff-808a-68dcb6455015\" (UID: \"393e9a7f-b46d-4bff-808a-68dcb6455015\") " Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.836255 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393e9a7f-b46d-4bff-808a-68dcb6455015-kube-api-access-k2m8g" (OuterVolumeSpecName: "kube-api-access-k2m8g") pod "393e9a7f-b46d-4bff-808a-68dcb6455015" (UID: "393e9a7f-b46d-4bff-808a-68dcb6455015"). InnerVolumeSpecName "kube-api-access-k2m8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.854098 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393e9a7f-b46d-4bff-808a-68dcb6455015-config-data" (OuterVolumeSpecName: "config-data") pod "393e9a7f-b46d-4bff-808a-68dcb6455015" (UID: "393e9a7f-b46d-4bff-808a-68dcb6455015"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.935361 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2m8g\" (UniqueName: \"kubernetes.io/projected/393e9a7f-b46d-4bff-808a-68dcb6455015-kube-api-access-k2m8g\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:39 crc kubenswrapper[4842]: I0311 19:16:39.935421 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393e9a7f-b46d-4bff-808a-68dcb6455015-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:40 crc kubenswrapper[4842]: I0311 19:16:40.125099 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:40 crc kubenswrapper[4842]: I0311 19:16:40.133206 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:16:40 crc kubenswrapper[4842]: I0311 19:16:40.975699 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393e9a7f-b46d-4bff-808a-68dcb6455015" path="/var/lib/kubelet/pods/393e9a7f-b46d-4bff-808a-68dcb6455015/volumes" Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.055854 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-552pr"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.062806 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-552pr"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 
19:16:41.069287 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.075733 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell1f473-account-delete-nxwvf"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.081716 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-f473-account-create-update-fsrrq"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.087390 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell1f473-account-delete-nxwvf"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.143180 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-db-create-f2qc7"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.152194 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-db-create-f2qc7"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.162961 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novaapi9f5a-account-delete-bxhzn"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.169692 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.175887 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-9f5a-account-create-update-gb4rv"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.182190 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novaapi9f5a-account-delete-bxhzn"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.248216 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-sf6dp"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.256871 4842 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-sf6dp"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.265839 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell0f700-account-delete-j8665"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.274131 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.290094 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-f700-account-create-update-w7qgq"] Mar 11 19:16:41 crc kubenswrapper[4842]: I0311 19:16:41.299685 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell0f700-account-delete-j8665"] Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.980185 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2385d679-2159-40b8-afae-681623c5faac" path="/var/lib/kubelet/pods/2385d679-2159-40b8-afae-681623c5faac/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.980865 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d18db8f-38cf-408a-9a11-48fdc55fc29f" path="/var/lib/kubelet/pods/4d18db8f-38cf-408a-9a11-48fdc55fc29f/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.982079 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf" path="/var/lib/kubelet/pods/60dad5bd-35ea-4cd0-9bb6-f00f7794bbaf/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.983173 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae" path="/var/lib/kubelet/pods/6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.984734 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8651b954-9fb9-4883-9018-21b8830a1254" path="/var/lib/kubelet/pods/8651b954-9fb9-4883-9018-21b8830a1254/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.985611 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ccf3aa4-40b4-49e1-a842-8daad633be37" path="/var/lib/kubelet/pods/8ccf3aa4-40b4-49e1-a842-8daad633be37/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.988829 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="929e25d9-24c3-457b-b067-f925aa4326ac" path="/var/lib/kubelet/pods/929e25d9-24c3-457b-b067-f925aa4326ac/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.989919 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cefa47e9-8b49-4b8f-a48a-d41d73fd62aa" path="/var/lib/kubelet/pods/cefa47e9-8b49-4b8f-a48a-d41d73fd62aa/volumes" Mar 11 19:16:42 crc kubenswrapper[4842]: I0311 19:16:42.990580 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab2ee5c-61e7-4313-af6f-8e6df74b134d" path="/var/lib/kubelet/pods/fab2ee5c-61e7-4313-af6f-8e6df74b134d/volumes" Mar 11 19:16:43 crc kubenswrapper[4842]: E0311 19:16:43.801704 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-novncproxy-config-data: secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:43 crc kubenswrapper[4842]: E0311 19:16:43.801801 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data podName:70231d57-b20e-4eac-aa2c-29d1a7247ee6 nodeName:}" failed. No retries permitted until 2026-03-11 19:16:51.80177702 +0000 UTC m=+1657.449473310 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data") pod "nova-kuttl-cell1-novncproxy-0" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6") : secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.402420 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.403430 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="9b4726d5-86fc-4b0f-8c37-f8213ce10731" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90" gracePeriod=30 Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.416335 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5"] Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.427018 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-m94z5"] Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.455879 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp"] Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.462683 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.462908 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="66604064-161f-428b-8162-424fde211ecd" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c" gracePeriod=30 Mar 11 19:16:46 crc 
kubenswrapper[4842]: I0311 19:16:46.468202 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-q7bdp"] Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.970776 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c61366-3a2d-4208-aa39-949370bd3232" path="/var/lib/kubelet/pods/11c61366-3a2d-4208-aa39-949370bd3232/volumes" Mar 11 19:16:46 crc kubenswrapper[4842]: I0311 19:16:46.971302 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55af88a0-a4e9-40e5-8e80-a38d11f5da3f" path="/var/lib/kubelet/pods/55af88a0-a4e9-40e5-8e80-a38d11f5da3f/volumes" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.303106 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.384218 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7zdw\" (UniqueName: \"kubernetes.io/projected/66604064-161f-428b-8162-424fde211ecd-kube-api-access-m7zdw\") pod \"66604064-161f-428b-8162-424fde211ecd\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.384345 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66604064-161f-428b-8162-424fde211ecd-config-data\") pod \"66604064-161f-428b-8162-424fde211ecd\" (UID: \"66604064-161f-428b-8162-424fde211ecd\") " Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.391125 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66604064-161f-428b-8162-424fde211ecd-kube-api-access-m7zdw" (OuterVolumeSpecName: "kube-api-access-m7zdw") pod "66604064-161f-428b-8162-424fde211ecd" (UID: "66604064-161f-428b-8162-424fde211ecd"). InnerVolumeSpecName "kube-api-access-m7zdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.413178 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66604064-161f-428b-8162-424fde211ecd-config-data" (OuterVolumeSpecName: "config-data") pod "66604064-161f-428b-8162-424fde211ecd" (UID: "66604064-161f-428b-8162-424fde211ecd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.485759 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7zdw\" (UniqueName: \"kubernetes.io/projected/66604064-161f-428b-8162-424fde211ecd-kube-api-access-m7zdw\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.485796 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66604064-161f-428b-8162-424fde211ecd-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.875366 4842 generic.go:334] "Generic (PLEG): container finished" podID="66604064-161f-428b-8162-424fde211ecd" containerID="941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c" exitCode=0 Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.875441 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"66604064-161f-428b-8162-424fde211ecd","Type":"ContainerDied","Data":"941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c"} Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.875487 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"66604064-161f-428b-8162-424fde211ecd","Type":"ContainerDied","Data":"a6f94ade9e28161bf96a9727a66aa80d9871a21fce8761482cb6974804e13080"} Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.875487 4842 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.875596 4842 scope.go:117] "RemoveContainer" containerID="941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.903586 4842 scope.go:117] "RemoveContainer" containerID="941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c" Mar 11 19:16:47 crc kubenswrapper[4842]: E0311 19:16:47.904308 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c\": container with ID starting with 941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c not found: ID does not exist" containerID="941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.904388 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c"} err="failed to get container status \"941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c\": rpc error: code = NotFound desc = could not find container \"941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c\": container with ID starting with 941a7788a0cc6d9180e4a30b841c91ba969c53fd0b148312961e04d70946398c not found: ID does not exist" Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.926559 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:47 crc kubenswrapper[4842]: I0311 19:16:47.948017 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704026 4842 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["nova-kuttl-default/nova-api-db-create-s8bgj"] Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704771 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393e9a7f-b46d-4bff-808a-68dcb6455015" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704785 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="393e9a7f-b46d-4bff-808a-68dcb6455015" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704801 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-log" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704806 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-log" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704816 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704823 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704837 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704843 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704854 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccf3aa4-40b4-49e1-a842-8daad633be37" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 
19:16:48.704860 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccf3aa4-40b4-49e1-a842-8daad633be37" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704873 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8651b954-9fb9-4883-9018-21b8830a1254" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704879 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8651b954-9fb9-4883-9018-21b8830a1254" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704889 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerName="nova-kuttl-metadata-log" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704894 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerName="nova-kuttl-metadata-log" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704906 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-api" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704912 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-api" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.704920 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66604064-161f-428b-8162-424fde211ecd" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.704926 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="66604064-161f-428b-8162-424fde211ecd" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705056 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6335b0fc-071c-48e6-b3ae-1c6b5de4b1ae" 
containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705073 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerName="nova-kuttl-metadata-log" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705089 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="66604064-161f-428b-8162-424fde211ecd" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705109 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-api" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705119 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8651b954-9fb9-4883-9018-21b8830a1254" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705131 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec95bd7-2128-4e76-a3a0-50483c052983" containerName="nova-kuttl-metadata-metadata" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705140 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="393e9a7f-b46d-4bff-808a-68dcb6455015" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705148 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccf3aa4-40b4-49e1-a842-8daad633be37" containerName="mariadb-account-delete" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705156 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f22161-086a-468c-a2b4-7d12e64ef4e3" containerName="nova-kuttl-api-log" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.705728 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.717061 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-s8bgj"] Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.763017 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs"] Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.764757 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.769810 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-api-db-secret" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.793108 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs"] Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.811392 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4e3090-6260-429d-a8cc-ff5ec73181ea-operator-scripts\") pod \"nova-api-db-create-s8bgj\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.811496 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjxn\" (UniqueName: \"kubernetes.io/projected/ea1668ef-02d4-47cd-a896-5769784534bc-kube-api-access-jkjxn\") pod \"nova-api-ec26-account-create-update-9ckbs\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.811548 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1668ef-02d4-47cd-a896-5769784534bc-operator-scripts\") pod \"nova-api-ec26-account-create-update-9ckbs\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.811626 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq7wf\" (UniqueName: \"kubernetes.io/projected/9e4e3090-6260-429d-a8cc-ff5ec73181ea-kube-api-access-rq7wf\") pod \"nova-api-db-create-s8bgj\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.821133 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gg2hz"] Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.822397 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.831573 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gg2hz"] Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.851981 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.896064 4842 generic.go:334] "Generic (PLEG): container finished" podID="9b4726d5-86fc-4b0f-8c37-f8213ce10731" containerID="585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90" exitCode=0 Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.896129 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.896135 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"9b4726d5-86fc-4b0f-8c37-f8213ce10731","Type":"ContainerDied","Data":"585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90"} Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.896245 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"9b4726d5-86fc-4b0f-8c37-f8213ce10731","Type":"ContainerDied","Data":"d80bf1a3f7f7072b701d83d596ac38c3f605c60aed7dd06f276c6f0357c780d6"} Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.896265 4842 scope.go:117] "RemoveContainer" containerID="585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.906517 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-dcjpk"] Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.906984 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4726d5-86fc-4b0f-8c37-f8213ce10731" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.907004 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4726d5-86fc-4b0f-8c37-f8213ce10731" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.907207 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4726d5-86fc-4b0f-8c37-f8213ce10731" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.907954 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.925024 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4726d5-86fc-4b0f-8c37-f8213ce10731-config-data\") pod \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.925085 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/9b4726d5-86fc-4b0f-8c37-f8213ce10731-kube-api-access-9jq8g\") pod \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\" (UID: \"9b4726d5-86fc-4b0f-8c37-f8213ce10731\") " Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.925127 4842 scope.go:117] "RemoveContainer" containerID="585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.925771 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45xtl\" (UniqueName: \"kubernetes.io/projected/bb9b95ff-e9b4-4df9-895c-172bb594b59e-kube-api-access-45xtl\") pod \"nova-cell0-db-create-gg2hz\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.926147 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq7wf\" (UniqueName: \"kubernetes.io/projected/9e4e3090-6260-429d-a8cc-ff5ec73181ea-kube-api-access-rq7wf\") pod \"nova-api-db-create-s8bgj\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.926186 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9e4e3090-6260-429d-a8cc-ff5ec73181ea-operator-scripts\") pod \"nova-api-db-create-s8bgj\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.926297 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9b95ff-e9b4-4df9-895c-172bb594b59e-operator-scripts\") pod \"nova-cell0-db-create-gg2hz\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.926355 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjxn\" (UniqueName: \"kubernetes.io/projected/ea1668ef-02d4-47cd-a896-5769784534bc-kube-api-access-jkjxn\") pod \"nova-api-ec26-account-create-update-9ckbs\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.926405 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1668ef-02d4-47cd-a896-5769784534bc-operator-scripts\") pod \"nova-api-ec26-account-create-update-9ckbs\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.927339 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4e3090-6260-429d-a8cc-ff5ec73181ea-operator-scripts\") pod \"nova-api-db-create-s8bgj\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.929827 4842 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1668ef-02d4-47cd-a896-5769784534bc-operator-scripts\") pod \"nova-api-ec26-account-create-update-9ckbs\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:48 crc kubenswrapper[4842]: E0311 19:16:48.931434 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90\": container with ID starting with 585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90 not found: ID does not exist" containerID="585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.931499 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90"} err="failed to get container status \"585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90\": rpc error: code = NotFound desc = could not find container \"585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90\": container with ID starting with 585cb5a2b3644850289663c5260df6b5e115e56e3e82648495292407dae84e90 not found: ID does not exist" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.935005 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-dcjpk"] Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.940468 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4726d5-86fc-4b0f-8c37-f8213ce10731-kube-api-access-9jq8g" (OuterVolumeSpecName: "kube-api-access-9jq8g") pod "9b4726d5-86fc-4b0f-8c37-f8213ce10731" (UID: "9b4726d5-86fc-4b0f-8c37-f8213ce10731"). InnerVolumeSpecName "kube-api-access-9jq8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.945207 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjxn\" (UniqueName: \"kubernetes.io/projected/ea1668ef-02d4-47cd-a896-5769784534bc-kube-api-access-jkjxn\") pod \"nova-api-ec26-account-create-update-9ckbs\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.948878 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq7wf\" (UniqueName: \"kubernetes.io/projected/9e4e3090-6260-429d-a8cc-ff5ec73181ea-kube-api-access-rq7wf\") pod \"nova-api-db-create-s8bgj\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.971184 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66604064-161f-428b-8162-424fde211ecd" path="/var/lib/kubelet/pods/66604064-161f-428b-8162-424fde211ecd/volumes" Mar 11 19:16:48 crc kubenswrapper[4842]: I0311 19:16:48.981584 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4726d5-86fc-4b0f-8c37-f8213ce10731-config-data" (OuterVolumeSpecName: "config-data") pod "9b4726d5-86fc-4b0f-8c37-f8213ce10731" (UID: "9b4726d5-86fc-4b0f-8c37-f8213ce10731"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.003662 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.004604 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.008621 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell0-db-secret" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.017965 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.028201 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45xtl\" (UniqueName: \"kubernetes.io/projected/bb9b95ff-e9b4-4df9-895c-172bb594b59e-kube-api-access-45xtl\") pod \"nova-cell0-db-create-gg2hz\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.028305 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cw98\" (UniqueName: \"kubernetes.io/projected/986d2be1-4400-40e0-8af9-9bb831ca357c-kube-api-access-6cw98\") pod \"nova-cell1-db-create-dcjpk\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.028351 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986d2be1-4400-40e0-8af9-9bb831ca357c-operator-scripts\") pod \"nova-cell1-db-create-dcjpk\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.028397 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9b95ff-e9b4-4df9-895c-172bb594b59e-operator-scripts\") pod 
\"nova-cell0-db-create-gg2hz\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.028465 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4726d5-86fc-4b0f-8c37-f8213ce10731-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.028478 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jq8g\" (UniqueName: \"kubernetes.io/projected/9b4726d5-86fc-4b0f-8c37-f8213ce10731-kube-api-access-9jq8g\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.029205 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9b95ff-e9b4-4df9-895c-172bb594b59e-operator-scripts\") pod \"nova-cell0-db-create-gg2hz\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.047136 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45xtl\" (UniqueName: \"kubernetes.io/projected/bb9b95ff-e9b4-4df9-895c-172bb594b59e-kube-api-access-45xtl\") pod \"nova-cell0-db-create-gg2hz\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.130449 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cw98\" (UniqueName: \"kubernetes.io/projected/986d2be1-4400-40e0-8af9-9bb831ca357c-kube-api-access-6cw98\") pod \"nova-cell1-db-create-dcjpk\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.130515 4842 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986d2be1-4400-40e0-8af9-9bb831ca357c-operator-scripts\") pod \"nova-cell1-db-create-dcjpk\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.130562 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cc024c-6b02-4efc-b7aa-7b1ec6785123-operator-scripts\") pod \"nova-cell0-0c0c-account-create-update-d76cc\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.130649 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd2d4\" (UniqueName: \"kubernetes.io/projected/43cc024c-6b02-4efc-b7aa-7b1ec6785123-kube-api-access-pd2d4\") pod \"nova-cell0-0c0c-account-create-update-d76cc\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.131248 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986d2be1-4400-40e0-8af9-9bb831ca357c-operator-scripts\") pod \"nova-cell1-db-create-dcjpk\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.138012 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.145021 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cw98\" (UniqueName: \"kubernetes.io/projected/986d2be1-4400-40e0-8af9-9bb831ca357c-kube-api-access-6cw98\") pod \"nova-cell1-db-create-dcjpk\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.148586 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.161621 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.211599 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.212877 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.218319 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.218888 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell1-db-secret" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.231528 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cc024c-6b02-4efc-b7aa-7b1ec6785123-operator-scripts\") pod \"nova-cell0-0c0c-account-create-update-d76cc\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.231610 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2d4\" (UniqueName: \"kubernetes.io/projected/43cc024c-6b02-4efc-b7aa-7b1ec6785123-kube-api-access-pd2d4\") pod \"nova-cell0-0c0c-account-create-update-d76cc\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.232466 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cc024c-6b02-4efc-b7aa-7b1ec6785123-operator-scripts\") pod \"nova-cell0-0c0c-account-create-update-d76cc\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.249390 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2d4\" (UniqueName: 
\"kubernetes.io/projected/43cc024c-6b02-4efc-b7aa-7b1ec6785123-kube-api-access-pd2d4\") pod \"nova-cell0-0c0c-account-create-update-d76cc\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.275869 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.279683 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.301204 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.322573 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.333257 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grwtm\" (UniqueName: \"kubernetes.io/projected/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-kube-api-access-grwtm\") pod \"nova-cell1-d291-account-create-update-glvdn\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.333475 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-operator-scripts\") pod \"nova-cell1-d291-account-create-update-glvdn\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.438424 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-operator-scripts\") pod \"nova-cell1-d291-account-create-update-glvdn\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.438552 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grwtm\" (UniqueName: \"kubernetes.io/projected/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-kube-api-access-grwtm\") pod \"nova-cell1-d291-account-create-update-glvdn\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.440513 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-operator-scripts\") pod \"nova-cell1-d291-account-create-update-glvdn\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.461194 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grwtm\" (UniqueName: \"kubernetes.io/projected/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-kube-api-access-grwtm\") pod \"nova-cell1-d291-account-create-update-glvdn\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.542257 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.645520 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.729609 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gg2hz"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.735829 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-s8bgj"] Mar 11 19:16:49 crc kubenswrapper[4842]: W0311 19:16:49.737323 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb9b95ff_e9b4_4df9_895c_172bb594b59e.slice/crio-9a4ae3eaa444ea69bd08ef51229a51dc456f3762ce8d58aacb25b33889df747a WatchSource:0}: Error finding container 9a4ae3eaa444ea69bd08ef51229a51dc456f3762ce8d58aacb25b33889df747a: Status 404 returned error can't find the container with id 9a4ae3eaa444ea69bd08ef51229a51dc456f3762ce8d58aacb25b33889df747a Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.867658 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc"] Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.878606 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn"] Mar 11 19:16:49 crc kubenswrapper[4842]: W0311 19:16:49.881401 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43cc024c_6b02_4efc_b7aa_7b1ec6785123.slice/crio-f84ba20c54bf6f6013bacd0fd8c8052ef966c11f3a5e9ea8b5d7fb5706b02f27 WatchSource:0}: Error finding container f84ba20c54bf6f6013bacd0fd8c8052ef966c11f3a5e9ea8b5d7fb5706b02f27: Status 404 returned error can't find the 
container with id f84ba20c54bf6f6013bacd0fd8c8052ef966c11f3a5e9ea8b5d7fb5706b02f27 Mar 11 19:16:49 crc kubenswrapper[4842]: W0311 19:16:49.898420 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1137b12c_774f_434b_9d92_a7d5b6ee6ef9.slice/crio-3c76babcd60c7176035f014c01ddfddd1de249f82a74ad474a48d186319e47df WatchSource:0}: Error finding container 3c76babcd60c7176035f014c01ddfddd1de249f82a74ad474a48d186319e47df: Status 404 returned error can't find the container with id 3c76babcd60c7176035f014c01ddfddd1de249f82a74ad474a48d186319e47df Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.906624 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-dcjpk"] Mar 11 19:16:49 crc kubenswrapper[4842]: W0311 19:16:49.921391 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod986d2be1_4400_40e0_8af9_9bb831ca357c.slice/crio-257494f03761e1a488f772f1f098b8d6c67f4f8dcf1d004a33e117ad8e432ce1 WatchSource:0}: Error finding container 257494f03761e1a488f772f1f098b8d6c67f4f8dcf1d004a33e117ad8e432ce1: Status 404 returned error can't find the container with id 257494f03761e1a488f772f1f098b8d6c67f4f8dcf1d004a33e117ad8e432ce1 Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.923616 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" event={"ID":"1137b12c-774f-434b-9d92-a7d5b6ee6ef9","Type":"ContainerStarted","Data":"3c76babcd60c7176035f014c01ddfddd1de249f82a74ad474a48d186319e47df"} Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.931143 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" event={"ID":"bb9b95ff-e9b4-4df9-895c-172bb594b59e","Type":"ContainerStarted","Data":"9a4ae3eaa444ea69bd08ef51229a51dc456f3762ce8d58aacb25b33889df747a"} 
Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.932085 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" event={"ID":"43cc024c-6b02-4efc-b7aa-7b1ec6785123","Type":"ContainerStarted","Data":"f84ba20c54bf6f6013bacd0fd8c8052ef966c11f3a5e9ea8b5d7fb5706b02f27"} Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.934490 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" event={"ID":"ea1668ef-02d4-47cd-a896-5769784534bc","Type":"ContainerStarted","Data":"b3c7f38c9a7a92bbe8480c9418530580b1bcd8873b264605f21486b39f5344e9"} Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.934533 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" event={"ID":"ea1668ef-02d4-47cd-a896-5769784534bc","Type":"ContainerStarted","Data":"cf92daf66d04677a88752d002b48b284adb63e5a10bc5edf1379eae5b6106106"} Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.935808 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-s8bgj" event={"ID":"9e4e3090-6260-429d-a8cc-ff5ec73181ea","Type":"ContainerStarted","Data":"2f0eb2825c344d3a3eab3c2650e13b0382dd2d9561542b8f4ec32acbfd01767d"} Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.949641 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" podStartSLOduration=1.94962031 podStartE2EDuration="1.94962031s" podCreationTimestamp="2026-03-11 19:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:49.944750909 +0000 UTC m=+1655.592447189" watchObservedRunningTime="2026-03-11 19:16:49.94962031 +0000 UTC m=+1655.597316590" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.963234 4842 
scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:16:49 crc kubenswrapper[4842]: E0311 19:16:49.963527 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:16:49 crc kubenswrapper[4842]: I0311 19:16:49.965150 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-api-db-create-s8bgj" podStartSLOduration=1.965132986 podStartE2EDuration="1.965132986s" podCreationTimestamp="2026-03-11 19:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:49.964003555 +0000 UTC m=+1655.611699855" watchObservedRunningTime="2026-03-11 19:16:49.965132986 +0000 UTC m=+1655.612829266" Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.949883 4842 generic.go:334] "Generic (PLEG): container finished" podID="1137b12c-774f-434b-9d92-a7d5b6ee6ef9" containerID="4f9ff140f937d2755384ceb7ecb2094811c0487e008c06235ccfafaeffbda3d9" exitCode=0 Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.950063 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" event={"ID":"1137b12c-774f-434b-9d92-a7d5b6ee6ef9","Type":"ContainerDied","Data":"4f9ff140f937d2755384ceb7ecb2094811c0487e008c06235ccfafaeffbda3d9"} Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.954681 4842 generic.go:334] "Generic (PLEG): container finished" podID="bb9b95ff-e9b4-4df9-895c-172bb594b59e" 
containerID="ba09ac168f1c10ac0a05210d2f655e06e8c6176dd3fcf7814e9683eb36d4c425" exitCode=0 Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.954965 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" event={"ID":"bb9b95ff-e9b4-4df9-895c-172bb594b59e","Type":"ContainerDied","Data":"ba09ac168f1c10ac0a05210d2f655e06e8c6176dd3fcf7814e9683eb36d4c425"} Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.957606 4842 generic.go:334] "Generic (PLEG): container finished" podID="43cc024c-6b02-4efc-b7aa-7b1ec6785123" containerID="93583afffb96aa0e89e642e781d2da3566fa59c9b2f599d8a928dc2f2f015c5e" exitCode=0 Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.957677 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" event={"ID":"43cc024c-6b02-4efc-b7aa-7b1ec6785123","Type":"ContainerDied","Data":"93583afffb96aa0e89e642e781d2da3566fa59c9b2f599d8a928dc2f2f015c5e"} Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.959684 4842 generic.go:334] "Generic (PLEG): container finished" podID="986d2be1-4400-40e0-8af9-9bb831ca357c" containerID="9a0648be5ad20533b3bc5df12e5ec9b79576f8a6335b3eedb2acbfe909d24253" exitCode=0 Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.959738 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" event={"ID":"986d2be1-4400-40e0-8af9-9bb831ca357c","Type":"ContainerDied","Data":"9a0648be5ad20533b3bc5df12e5ec9b79576f8a6335b3eedb2acbfe909d24253"} Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.959783 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" event={"ID":"986d2be1-4400-40e0-8af9-9bb831ca357c","Type":"ContainerStarted","Data":"257494f03761e1a488f772f1f098b8d6c67f4f8dcf1d004a33e117ad8e432ce1"} Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.961124 4842 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" event={"ID":"ea1668ef-02d4-47cd-a896-5769784534bc","Type":"ContainerDied","Data":"b3c7f38c9a7a92bbe8480c9418530580b1bcd8873b264605f21486b39f5344e9"} Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.961126 4842 generic.go:334] "Generic (PLEG): container finished" podID="ea1668ef-02d4-47cd-a896-5769784534bc" containerID="b3c7f38c9a7a92bbe8480c9418530580b1bcd8873b264605f21486b39f5344e9" exitCode=0 Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.963215 4842 generic.go:334] "Generic (PLEG): container finished" podID="9e4e3090-6260-429d-a8cc-ff5ec73181ea" containerID="f5c5e0d16de8e52cda1d25b45574f431d9af1c2ed4ac9c042a73865e25f653e9" exitCode=0 Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.974884 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4726d5-86fc-4b0f-8c37-f8213ce10731" path="/var/lib/kubelet/pods/9b4726d5-86fc-4b0f-8c37-f8213ce10731/volumes" Mar 11 19:16:50 crc kubenswrapper[4842]: I0311 19:16:50.975481 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-s8bgj" event={"ID":"9e4e3090-6260-429d-a8cc-ff5ec73181ea","Type":"ContainerDied","Data":"f5c5e0d16de8e52cda1d25b45574f431d9af1c2ed4ac9c042a73865e25f653e9"} Mar 11 19:16:51 crc kubenswrapper[4842]: E0311 19:16:51.880086 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-novncproxy-config-data: secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:51 crc kubenswrapper[4842]: E0311 19:16:51.880155 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data podName:70231d57-b20e-4eac-aa2c-29d1a7247ee6 nodeName:}" failed. No retries permitted until 2026-03-11 19:17:07.880140528 +0000 UTC m=+1673.527836808 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data") pod "nova-kuttl-cell1-novncproxy-0" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6") : secret "nova-kuttl-cell1-novncproxy-config-data" not found Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.323031 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.386963 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9b95ff-e9b4-4df9-895c-172bb594b59e-operator-scripts\") pod \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.387130 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45xtl\" (UniqueName: \"kubernetes.io/projected/bb9b95ff-e9b4-4df9-895c-172bb594b59e-kube-api-access-45xtl\") pod \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\" (UID: \"bb9b95ff-e9b4-4df9-895c-172bb594b59e\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.387859 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb9b95ff-e9b4-4df9-895c-172bb594b59e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb9b95ff-e9b4-4df9-895c-172bb594b59e" (UID: "bb9b95ff-e9b4-4df9-895c-172bb594b59e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.394638 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb9b95ff-e9b4-4df9-895c-172bb594b59e-kube-api-access-45xtl" (OuterVolumeSpecName: "kube-api-access-45xtl") pod "bb9b95ff-e9b4-4df9-895c-172bb594b59e" (UID: "bb9b95ff-e9b4-4df9-895c-172bb594b59e"). InnerVolumeSpecName "kube-api-access-45xtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.489374 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45xtl\" (UniqueName: \"kubernetes.io/projected/bb9b95ff-e9b4-4df9-895c-172bb594b59e-kube-api-access-45xtl\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.489407 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9b95ff-e9b4-4df9-895c-172bb594b59e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.693442 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.698154 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.702342 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.712710 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.723212 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.793484 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4e3090-6260-429d-a8cc-ff5ec73181ea-operator-scripts\") pod \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.793614 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkjxn\" (UniqueName: \"kubernetes.io/projected/ea1668ef-02d4-47cd-a896-5769784534bc-kube-api-access-jkjxn\") pod \"ea1668ef-02d4-47cd-a896-5769784534bc\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.793997 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4e3090-6260-429d-a8cc-ff5ec73181ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e4e3090-6260-429d-a8cc-ff5ec73181ea" (UID: "9e4e3090-6260-429d-a8cc-ff5ec73181ea"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794262 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-operator-scripts\") pod \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794318 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grwtm\" (UniqueName: \"kubernetes.io/projected/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-kube-api-access-grwtm\") pod \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\" (UID: \"1137b12c-774f-434b-9d92-a7d5b6ee6ef9\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794374 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986d2be1-4400-40e0-8af9-9bb831ca357c-operator-scripts\") pod \"986d2be1-4400-40e0-8af9-9bb831ca357c\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794437 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1668ef-02d4-47cd-a896-5769784534bc-operator-scripts\") pod \"ea1668ef-02d4-47cd-a896-5769784534bc\" (UID: \"ea1668ef-02d4-47cd-a896-5769784534bc\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794510 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd2d4\" (UniqueName: \"kubernetes.io/projected/43cc024c-6b02-4efc-b7aa-7b1ec6785123-kube-api-access-pd2d4\") pod \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794556 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cc024c-6b02-4efc-b7aa-7b1ec6785123-operator-scripts\") pod \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\" (UID: \"43cc024c-6b02-4efc-b7aa-7b1ec6785123\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794580 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cw98\" (UniqueName: \"kubernetes.io/projected/986d2be1-4400-40e0-8af9-9bb831ca357c-kube-api-access-6cw98\") pod \"986d2be1-4400-40e0-8af9-9bb831ca357c\" (UID: \"986d2be1-4400-40e0-8af9-9bb831ca357c\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.794604 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq7wf\" (UniqueName: \"kubernetes.io/projected/9e4e3090-6260-429d-a8cc-ff5ec73181ea-kube-api-access-rq7wf\") pod \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\" (UID: \"9e4e3090-6260-429d-a8cc-ff5ec73181ea\") " Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.795053 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e4e3090-6260-429d-a8cc-ff5ec73181ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.795118 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43cc024c-6b02-4efc-b7aa-7b1ec6785123-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43cc024c-6b02-4efc-b7aa-7b1ec6785123" (UID: "43cc024c-6b02-4efc-b7aa-7b1ec6785123"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.795161 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1137b12c-774f-434b-9d92-a7d5b6ee6ef9" (UID: "1137b12c-774f-434b-9d92-a7d5b6ee6ef9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.795305 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea1668ef-02d4-47cd-a896-5769784534bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea1668ef-02d4-47cd-a896-5769784534bc" (UID: "ea1668ef-02d4-47cd-a896-5769784534bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.795575 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/986d2be1-4400-40e0-8af9-9bb831ca357c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "986d2be1-4400-40e0-8af9-9bb831ca357c" (UID: "986d2be1-4400-40e0-8af9-9bb831ca357c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.796475 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1668ef-02d4-47cd-a896-5769784534bc-kube-api-access-jkjxn" (OuterVolumeSpecName: "kube-api-access-jkjxn") pod "ea1668ef-02d4-47cd-a896-5769784534bc" (UID: "ea1668ef-02d4-47cd-a896-5769784534bc"). InnerVolumeSpecName "kube-api-access-jkjxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.796987 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43cc024c-6b02-4efc-b7aa-7b1ec6785123-kube-api-access-pd2d4" (OuterVolumeSpecName: "kube-api-access-pd2d4") pod "43cc024c-6b02-4efc-b7aa-7b1ec6785123" (UID: "43cc024c-6b02-4efc-b7aa-7b1ec6785123"). InnerVolumeSpecName "kube-api-access-pd2d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.797278 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986d2be1-4400-40e0-8af9-9bb831ca357c-kube-api-access-6cw98" (OuterVolumeSpecName: "kube-api-access-6cw98") pod "986d2be1-4400-40e0-8af9-9bb831ca357c" (UID: "986d2be1-4400-40e0-8af9-9bb831ca357c"). InnerVolumeSpecName "kube-api-access-6cw98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.797414 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4e3090-6260-429d-a8cc-ff5ec73181ea-kube-api-access-rq7wf" (OuterVolumeSpecName: "kube-api-access-rq7wf") pod "9e4e3090-6260-429d-a8cc-ff5ec73181ea" (UID: "9e4e3090-6260-429d-a8cc-ff5ec73181ea"). InnerVolumeSpecName "kube-api-access-rq7wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.798164 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-kube-api-access-grwtm" (OuterVolumeSpecName: "kube-api-access-grwtm") pod "1137b12c-774f-434b-9d92-a7d5b6ee6ef9" (UID: "1137b12c-774f-434b-9d92-a7d5b6ee6ef9"). InnerVolumeSpecName "kube-api-access-grwtm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896526 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkjxn\" (UniqueName: \"kubernetes.io/projected/ea1668ef-02d4-47cd-a896-5769784534bc-kube-api-access-jkjxn\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896560 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896571 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grwtm\" (UniqueName: \"kubernetes.io/projected/1137b12c-774f-434b-9d92-a7d5b6ee6ef9-kube-api-access-grwtm\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896580 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986d2be1-4400-40e0-8af9-9bb831ca357c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896590 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea1668ef-02d4-47cd-a896-5769784534bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896598 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd2d4\" (UniqueName: \"kubernetes.io/projected/43cc024c-6b02-4efc-b7aa-7b1ec6785123-kube-api-access-pd2d4\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896607 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cc024c-6b02-4efc-b7aa-7b1ec6785123-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 
19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896617 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cw98\" (UniqueName: \"kubernetes.io/projected/986d2be1-4400-40e0-8af9-9bb831ca357c-kube-api-access-6cw98\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:52 crc kubenswrapper[4842]: I0311 19:16:52.896624 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq7wf\" (UniqueName: \"kubernetes.io/projected/9e4e3090-6260-429d-a8cc-ff5ec73181ea-kube-api-access-rq7wf\") on node \"crc\" DevicePath \"\"" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.007637 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.007663 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-gg2hz" event={"ID":"bb9b95ff-e9b4-4df9-895c-172bb594b59e","Type":"ContainerDied","Data":"9a4ae3eaa444ea69bd08ef51229a51dc456f3762ce8d58aacb25b33889df747a"} Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.007719 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a4ae3eaa444ea69bd08ef51229a51dc456f3762ce8d58aacb25b33889df747a" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.009991 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" event={"ID":"43cc024c-6b02-4efc-b7aa-7b1ec6785123","Type":"ContainerDied","Data":"f84ba20c54bf6f6013bacd0fd8c8052ef966c11f3a5e9ea8b5d7fb5706b02f27"} Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.010011 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.010024 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f84ba20c54bf6f6013bacd0fd8c8052ef966c11f3a5e9ea8b5d7fb5706b02f27" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.013206 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" event={"ID":"986d2be1-4400-40e0-8af9-9bb831ca357c","Type":"ContainerDied","Data":"257494f03761e1a488f772f1f098b8d6c67f4f8dcf1d004a33e117ad8e432ce1"} Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.013249 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="257494f03761e1a488f772f1f098b8d6c67f4f8dcf1d004a33e117ad8e432ce1" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.013327 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-dcjpk" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.021504 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" event={"ID":"ea1668ef-02d4-47cd-a896-5769784534bc","Type":"ContainerDied","Data":"cf92daf66d04677a88752d002b48b284adb63e5a10bc5edf1379eae5b6106106"} Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.021593 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf92daf66d04677a88752d002b48b284adb63e5a10bc5edf1379eae5b6106106" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.021661 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.027848 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-s8bgj" event={"ID":"9e4e3090-6260-429d-a8cc-ff5ec73181ea","Type":"ContainerDied","Data":"2f0eb2825c344d3a3eab3c2650e13b0382dd2d9561542b8f4ec32acbfd01767d"} Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.027894 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f0eb2825c344d3a3eab3c2650e13b0382dd2d9561542b8f4ec32acbfd01767d" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.027917 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-s8bgj" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.032943 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" event={"ID":"1137b12c-774f-434b-9d92-a7d5b6ee6ef9","Type":"ContainerDied","Data":"3c76babcd60c7176035f014c01ddfddd1de249f82a74ad474a48d186319e47df"} Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.032979 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c76babcd60c7176035f014c01ddfddd1de249f82a74ad474a48d186319e47df" Mar 11 19:16:53 crc kubenswrapper[4842]: I0311 19:16:53.033060 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.493887 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld"] Mar 11 19:16:54 crc kubenswrapper[4842]: E0311 19:16:54.495010 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9b95ff-e9b4-4df9-895c-172bb594b59e" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495032 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9b95ff-e9b4-4df9-895c-172bb594b59e" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: E0311 19:16:54.495049 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43cc024c-6b02-4efc-b7aa-7b1ec6785123" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495060 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="43cc024c-6b02-4efc-b7aa-7b1ec6785123" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: E0311 19:16:54.495081 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986d2be1-4400-40e0-8af9-9bb831ca357c" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495091 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="986d2be1-4400-40e0-8af9-9bb831ca357c" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: E0311 19:16:54.495119 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4e3090-6260-429d-a8cc-ff5ec73181ea" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495130 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4e3090-6260-429d-a8cc-ff5ec73181ea" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: E0311 
19:16:54.495146 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1137b12c-774f-434b-9d92-a7d5b6ee6ef9" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495159 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1137b12c-774f-434b-9d92-a7d5b6ee6ef9" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: E0311 19:16:54.495186 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1668ef-02d4-47cd-a896-5769784534bc" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495197 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1668ef-02d4-47cd-a896-5769784534bc" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495435 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e4e3090-6260-429d-a8cc-ff5ec73181ea" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495456 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1668ef-02d4-47cd-a896-5769784534bc" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495472 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="43cc024c-6b02-4efc-b7aa-7b1ec6785123" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495498 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1137b12c-774f-434b-9d92-a7d5b6ee6ef9" containerName="mariadb-account-create-update" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495519 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="986d2be1-4400-40e0-8af9-9bb831ca357c" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.495531 4842 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="bb9b95ff-e9b4-4df9-895c-172bb594b59e" containerName="mariadb-database-create" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.496294 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.499009 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.499628 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-scripts" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.500010 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-nh894" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.507534 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld"] Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.630314 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.630651 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgslx\" (UniqueName: \"kubernetes.io/projected/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-kube-api-access-zgslx\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.630766 
4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.732152 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.732226 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgslx\" (UniqueName: \"kubernetes.io/projected/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-kube-api-access-zgslx\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.732288 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.739723 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " 
pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.739893 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.769450 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgslx\" (UniqueName: \"kubernetes.io/projected/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-kube-api-access-zgslx\") pod \"nova-kuttl-cell0-conductor-db-sync-t8rld\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:54 crc kubenswrapper[4842]: I0311 19:16:54.821772 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:16:55 crc kubenswrapper[4842]: I0311 19:16:55.297444 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld"] Mar 11 19:16:55 crc kubenswrapper[4842]: W0311 19:16:55.301501 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d2faaf_7c1c_4d5b_886b_67c259fe8f77.slice/crio-8dbea4b10847834a165c8c5bdbcd6ed7b16ea0b162cb14ba1ee04ab87205f478 WatchSource:0}: Error finding container 8dbea4b10847834a165c8c5bdbcd6ed7b16ea0b162cb14ba1ee04ab87205f478: Status 404 returned error can't find the container with id 8dbea4b10847834a165c8c5bdbcd6ed7b16ea0b162cb14ba1ee04ab87205f478 Mar 11 19:16:56 crc kubenswrapper[4842]: I0311 19:16:56.069409 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" event={"ID":"56d2faaf-7c1c-4d5b-886b-67c259fe8f77","Type":"ContainerStarted","Data":"96858c3c26bffd17951c0120a7e0865a7b90b70ba13dccd00f694bb5067e9636"} Mar 11 19:16:56 crc kubenswrapper[4842]: I0311 19:16:56.069988 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" event={"ID":"56d2faaf-7c1c-4d5b-886b-67c259fe8f77","Type":"ContainerStarted","Data":"8dbea4b10847834a165c8c5bdbcd6ed7b16ea0b162cb14ba1ee04ab87205f478"} Mar 11 19:16:56 crc kubenswrapper[4842]: I0311 19:16:56.109471 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" podStartSLOduration=2.109445163 podStartE2EDuration="2.109445163s" podCreationTimestamp="2026-03-11 19:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:16:56.105518608 +0000 UTC m=+1661.753214918" 
watchObservedRunningTime="2026-03-11 19:16:56.109445163 +0000 UTC m=+1661.757141483" Mar 11 19:17:00 crc kubenswrapper[4842]: E0311 19:17:00.101530 4842 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d2faaf_7c1c_4d5b_886b_67c259fe8f77.slice/crio-96858c3c26bffd17951c0120a7e0865a7b90b70ba13dccd00f694bb5067e9636.scope\": RecentStats: unable to find data in memory cache]" Mar 11 19:17:00 crc kubenswrapper[4842]: I0311 19:17:00.109122 4842 generic.go:334] "Generic (PLEG): container finished" podID="56d2faaf-7c1c-4d5b-886b-67c259fe8f77" containerID="96858c3c26bffd17951c0120a7e0865a7b90b70ba13dccd00f694bb5067e9636" exitCode=0 Mar 11 19:17:00 crc kubenswrapper[4842]: I0311 19:17:00.109180 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" event={"ID":"56d2faaf-7c1c-4d5b-886b-67c259fe8f77","Type":"ContainerDied","Data":"96858c3c26bffd17951c0120a7e0865a7b90b70ba13dccd00f694bb5067e9636"} Mar 11 19:17:00 crc kubenswrapper[4842]: I0311 19:17:00.963026 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:17:00 crc kubenswrapper[4842]: E0311 19:17:00.963704 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.512823 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.662935 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgslx\" (UniqueName: \"kubernetes.io/projected/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-kube-api-access-zgslx\") pod \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.663059 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-config-data\") pod \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.663109 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-scripts\") pod \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\" (UID: \"56d2faaf-7c1c-4d5b-886b-67c259fe8f77\") " Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.668467 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-scripts" (OuterVolumeSpecName: "scripts") pod "56d2faaf-7c1c-4d5b-886b-67c259fe8f77" (UID: "56d2faaf-7c1c-4d5b-886b-67c259fe8f77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.669602 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-kube-api-access-zgslx" (OuterVolumeSpecName: "kube-api-access-zgslx") pod "56d2faaf-7c1c-4d5b-886b-67c259fe8f77" (UID: "56d2faaf-7c1c-4d5b-886b-67c259fe8f77"). InnerVolumeSpecName "kube-api-access-zgslx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.703527 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-config-data" (OuterVolumeSpecName: "config-data") pod "56d2faaf-7c1c-4d5b-886b-67c259fe8f77" (UID: "56d2faaf-7c1c-4d5b-886b-67c259fe8f77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.764758 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgslx\" (UniqueName: \"kubernetes.io/projected/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-kube-api-access-zgslx\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.764790 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:01 crc kubenswrapper[4842]: I0311 19:17:01.764803 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56d2faaf-7c1c-4d5b-886b-67c259fe8f77-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.135554 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" event={"ID":"56d2faaf-7c1c-4d5b-886b-67c259fe8f77","Type":"ContainerDied","Data":"8dbea4b10847834a165c8c5bdbcd6ed7b16ea0b162cb14ba1ee04ab87205f478"} Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.135616 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dbea4b10847834a165c8c5bdbcd6ed7b16ea0b162cb14ba1ee04ab87205f478" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.135651 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.233981 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:17:02 crc kubenswrapper[4842]: E0311 19:17:02.234299 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d2faaf-7c1c-4d5b-886b-67c259fe8f77" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.234314 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d2faaf-7c1c-4d5b-886b-67c259fe8f77" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.234477 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d2faaf-7c1c-4d5b-886b-67c259fe8f77" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.234959 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.238788 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.239684 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-nh894" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.256874 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.375788 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx2m4\" (UniqueName: \"kubernetes.io/projected/63c2afef-0b62-427f-942e-330b7a88f2b3-kube-api-access-fx2m4\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.375868 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c2afef-0b62-427f-942e-330b7a88f2b3-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.477384 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c2afef-0b62-427f-942e-330b7a88f2b3-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.477520 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fx2m4\" (UniqueName: \"kubernetes.io/projected/63c2afef-0b62-427f-942e-330b7a88f2b3-kube-api-access-fx2m4\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.482451 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c2afef-0b62-427f-942e-330b7a88f2b3-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.494649 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx2m4\" (UniqueName: \"kubernetes.io/projected/63c2afef-0b62-427f-942e-330b7a88f2b3-kube-api-access-fx2m4\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:02 crc kubenswrapper[4842]: I0311 19:17:02.590405 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:03 crc kubenswrapper[4842]: I0311 19:17:03.017685 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:17:03 crc kubenswrapper[4842]: W0311 19:17:03.021425 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c2afef_0b62_427f_942e_330b7a88f2b3.slice/crio-4fb7de1146cab22a60720584aa6483ac50b99497f6dc06732724bc814f2343f2 WatchSource:0}: Error finding container 4fb7de1146cab22a60720584aa6483ac50b99497f6dc06732724bc814f2343f2: Status 404 returned error can't find the container with id 4fb7de1146cab22a60720584aa6483ac50b99497f6dc06732724bc814f2343f2 Mar 11 19:17:03 crc kubenswrapper[4842]: I0311 19:17:03.146121 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"63c2afef-0b62-427f-942e-330b7a88f2b3","Type":"ContainerStarted","Data":"4fb7de1146cab22a60720584aa6483ac50b99497f6dc06732724bc814f2343f2"} Mar 11 19:17:04 crc kubenswrapper[4842]: I0311 19:17:04.175410 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"63c2afef-0b62-427f-942e-330b7a88f2b3","Type":"ContainerStarted","Data":"5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589"} Mar 11 19:17:04 crc kubenswrapper[4842]: I0311 19:17:04.175794 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:04 crc kubenswrapper[4842]: I0311 19:17:04.194460 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podStartSLOduration=2.194439082 podStartE2EDuration="2.194439082s" podCreationTimestamp="2026-03-11 19:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:04.190177287 +0000 UTC m=+1669.837873567" watchObservedRunningTime="2026-03-11 19:17:04.194439082 +0000 UTC m=+1669.842135362" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.090507 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.149287 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f57vt\" (UniqueName: \"kubernetes.io/projected/70231d57-b20e-4eac-aa2c-29d1a7247ee6-kube-api-access-f57vt\") pod \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\" (UID: \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") " Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.149495 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data\") pod \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\" (UID: \"70231d57-b20e-4eac-aa2c-29d1a7247ee6\") " Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.168490 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70231d57-b20e-4eac-aa2c-29d1a7247ee6-kube-api-access-f57vt" (OuterVolumeSpecName: "kube-api-access-f57vt") pod "70231d57-b20e-4eac-aa2c-29d1a7247ee6" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6"). InnerVolumeSpecName "kube-api-access-f57vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.181645 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data" (OuterVolumeSpecName: "config-data") pod "70231d57-b20e-4eac-aa2c-29d1a7247ee6" (UID: "70231d57-b20e-4eac-aa2c-29d1a7247ee6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.220240 4842 generic.go:334] "Generic (PLEG): container finished" podID="70231d57-b20e-4eac-aa2c-29d1a7247ee6" containerID="77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce" exitCode=137 Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.220315 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"70231d57-b20e-4eac-aa2c-29d1a7247ee6","Type":"ContainerDied","Data":"77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce"} Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.220347 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"70231d57-b20e-4eac-aa2c-29d1a7247ee6","Type":"ContainerDied","Data":"6e8945c6350e541b48d8691e1a5dd71534857af34034fdbaab8445c47902086c"} Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.220391 4842 scope.go:117] "RemoveContainer" containerID="77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.220582 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.251030 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f57vt\" (UniqueName: \"kubernetes.io/projected/70231d57-b20e-4eac-aa2c-29d1a7247ee6-kube-api-access-f57vt\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.251350 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70231d57-b20e-4eac-aa2c-29d1a7247ee6-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.264912 4842 scope.go:117] "RemoveContainer" containerID="77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce" Mar 11 19:17:07 crc kubenswrapper[4842]: E0311 19:17:07.265482 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce\": container with ID starting with 77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce not found: ID does not exist" containerID="77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.265511 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce"} err="failed to get container status \"77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce\": rpc error: code = NotFound desc = could not find container \"77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce\": container with ID starting with 77ee1b40db114dfa607bc58309db7b80588907f349d6a710f127a5e966e679ce not found: ID does not exist" Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.266745 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:17:07 crc kubenswrapper[4842]: I0311 19:17:07.272590 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:17:08 crc kubenswrapper[4842]: I0311 19:17:08.970733 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70231d57-b20e-4eac-aa2c-29d1a7247ee6" path="/var/lib/kubelet/pods/70231d57-b20e-4eac-aa2c-29d1a7247ee6/volumes" Mar 11 19:17:12 crc kubenswrapper[4842]: I0311 19:17:12.957461 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.375083 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87"] Mar 11 19:17:13 crc kubenswrapper[4842]: E0311 19:17:13.375673 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70231d57-b20e-4eac-aa2c-29d1a7247ee6" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.375696 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="70231d57-b20e-4eac-aa2c-29d1a7247ee6" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.375832 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="70231d57-b20e-4eac-aa2c-29d1a7247ee6" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.376360 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.380020 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-scripts" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.380165 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-config-data" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.387048 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.447976 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb7v8\" (UniqueName: \"kubernetes.io/projected/6f373954-daef-4bf6-a56b-7036ad380787-kube-api-access-zb7v8\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.448112 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-scripts\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.448215 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-config-data\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.549774 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-config-data\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.549843 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb7v8\" (UniqueName: \"kubernetes.io/projected/6f373954-daef-4bf6-a56b-7036ad380787-kube-api-access-zb7v8\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.549912 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-scripts\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.554708 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-config-data\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.562841 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-scripts\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 
19:17:13.575085 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb7v8\" (UniqueName: \"kubernetes.io/projected/6f373954-daef-4bf6-a56b-7036ad380787-kube-api-access-zb7v8\") pod \"nova-kuttl-cell0-cell-mapping-7mh87\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.594309 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.595690 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.598377 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.609590 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.635547 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.638302 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.644416 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.651078 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ce69a3-094f-4115-a4b1-f8ad35e6de47-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.651157 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgthd\" (UniqueName: \"kubernetes.io/projected/71ce69a3-094f-4115-a4b1-f8ad35e6de47-kube-api-access-rgthd\") pod \"nova-kuttl-scheduler-0\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.669051 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.697608 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.723140 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.724975 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.728564 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.737113 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.752078 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.753012 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.754446 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1ad7ec-2dae-4d89-a517-70beea3f1321-config-data\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.754491 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgb7n\" (UniqueName: \"kubernetes.io/projected/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-kube-api-access-wgb7n\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.763874 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.766748 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-novncproxy-config-data" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.771697 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.782080 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ce69a3-094f-4115-a4b1-f8ad35e6de47-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.782121 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1ad7ec-2dae-4d89-a517-70beea3f1321-logs\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.782144 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.782296 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgthd\" (UniqueName: \"kubernetes.io/projected/71ce69a3-094f-4115-a4b1-f8ad35e6de47-kube-api-access-rgthd\") pod \"nova-kuttl-scheduler-0\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc 
kubenswrapper[4842]: I0311 19:17:13.782340 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rjc\" (UniqueName: \"kubernetes.io/projected/ca1ad7ec-2dae-4d89-a517-70beea3f1321-kube-api-access-n4rjc\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.787585 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ce69a3-094f-4115-a4b1-f8ad35e6de47-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.804103 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgthd\" (UniqueName: \"kubernetes.io/projected/71ce69a3-094f-4115-a4b1-f8ad35e6de47-kube-api-access-rgthd\") pod \"nova-kuttl-scheduler-0\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.885002 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.885161 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rjc\" (UniqueName: \"kubernetes.io/projected/ca1ad7ec-2dae-4d89-a517-70beea3f1321-kube-api-access-n4rjc\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 
19:17:13.885261 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.885328 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgb7n\" (UniqueName: \"kubernetes.io/projected/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-kube-api-access-wgb7n\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.885350 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1ad7ec-2dae-4d89-a517-70beea3f1321-config-data\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.887441 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7xq\" (UniqueName: \"kubernetes.io/projected/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-kube-api-access-gw7xq\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.887521 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1ad7ec-2dae-4d89-a517-70beea3f1321-logs\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.887541 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.887975 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.888312 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1ad7ec-2dae-4d89-a517-70beea3f1321-logs\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.892292 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.896146 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1ad7ec-2dae-4d89-a517-70beea3f1321-config-data\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.904955 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgb7n\" (UniqueName: \"kubernetes.io/projected/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-kube-api-access-wgb7n\") pod \"nova-kuttl-metadata-0\" (UID: 
\"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.906867 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rjc\" (UniqueName: \"kubernetes.io/projected/ca1ad7ec-2dae-4d89-a517-70beea3f1321-kube-api-access-n4rjc\") pod \"nova-kuttl-api-0\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.962437 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:17:13 crc kubenswrapper[4842]: E0311 19:17:13.962743 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.969075 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.988537 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.988683 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7xq\" (UniqueName: \"kubernetes.io/projected/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-kube-api-access-gw7xq\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:13 crc kubenswrapper[4842]: I0311 19:17:13.999163 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.001891 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.005033 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7xq\" (UniqueName: \"kubernetes.io/projected/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-kube-api-access-gw7xq\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.139819 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.185815 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.221970 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87"] Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.305475 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" event={"ID":"6f373954-daef-4bf6-a56b-7036ad380787","Type":"ContainerStarted","Data":"f9a2f893896bae0c0e580ca4fb8c4a0a1ac248a9c9b08770cb39043a5dd368ac"} Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.342953 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh"] Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.345903 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.348780 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.348926 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-scripts" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.352738 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh"] Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.400798 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qnc8\" (UniqueName: \"kubernetes.io/projected/04abba73-25fa-4cda-b3cc-6a2cc23a769b-kube-api-access-5qnc8\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.403510 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.403820 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc 
kubenswrapper[4842]: W0311 19:17:14.421306 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ce69a3_094f_4115_a4b1_f8ad35e6de47.slice/crio-e32d237bf9590500a9e12ab1c726f5d0818185201447e4af328d8ee4c75ef3b1 WatchSource:0}: Error finding container e32d237bf9590500a9e12ab1c726f5d0818185201447e4af328d8ee4c75ef3b1: Status 404 returned error can't find the container with id e32d237bf9590500a9e12ab1c726f5d0818185201447e4af328d8ee4c75ef3b1 Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.421875 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.495183 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:14 crc kubenswrapper[4842]: W0311 19:17:14.505015 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca1ad7ec_2dae_4d89_a517_70beea3f1321.slice/crio-56f3d99f9cb4ecd5fa4a17a195c207d1e7f3a789b932aa1f8eed37c5f593803e WatchSource:0}: Error finding container 56f3d99f9cb4ecd5fa4a17a195c207d1e7f3a789b932aa1f8eed37c5f593803e: Status 404 returned error can't find the container with id 56f3d99f9cb4ecd5fa4a17a195c207d1e7f3a789b932aa1f8eed37c5f593803e Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.505123 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qnc8\" (UniqueName: \"kubernetes.io/projected/04abba73-25fa-4cda-b3cc-6a2cc23a769b-kube-api-access-5qnc8\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.505207 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.505295 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.510222 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.512676 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.522340 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qnc8\" (UniqueName: \"kubernetes.io/projected/04abba73-25fa-4cda-b3cc-6a2cc23a769b-kube-api-access-5qnc8\") pod \"nova-kuttl-cell1-conductor-db-sync-mqfqh\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.623989 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.710989 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:17:14 crc kubenswrapper[4842]: I0311 19:17:14.712530 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:14 crc kubenswrapper[4842]: W0311 19:17:14.712663 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc31eb1f5_ab2d_48d6_82d8_6af5678a670d.slice/crio-59426aec751321b417b21dcdb2bddada7577234358debe06a9fa058bd1bb9102 WatchSource:0}: Error finding container 59426aec751321b417b21dcdb2bddada7577234358debe06a9fa058bd1bb9102: Status 404 returned error can't find the container with id 59426aec751321b417b21dcdb2bddada7577234358debe06a9fa058bd1bb9102 Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.169158 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh"] Mar 11 19:17:15 crc kubenswrapper[4842]: W0311 19:17:15.172887 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04abba73_25fa_4cda_b3cc_6a2cc23a769b.slice/crio-7f283a2280e000b3aca587e46c8ce2a3b258ec7a925a83ec587f5e76e3a522d9 WatchSource:0}: Error finding container 7f283a2280e000b3aca587e46c8ce2a3b258ec7a925a83ec587f5e76e3a522d9: Status 404 returned error can't find the container with id 7f283a2280e000b3aca587e46c8ce2a3b258ec7a925a83ec587f5e76e3a522d9 Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.332630 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" 
event={"ID":"04abba73-25fa-4cda-b3cc-6a2cc23a769b","Type":"ContainerStarted","Data":"b9b07fbe672763ff024113057c1090a46efba5122d370ec0c70d7beb8256f443"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.332670 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" event={"ID":"04abba73-25fa-4cda-b3cc-6a2cc23a769b","Type":"ContainerStarted","Data":"7f283a2280e000b3aca587e46c8ce2a3b258ec7a925a83ec587f5e76e3a522d9"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.334231 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" event={"ID":"6f373954-daef-4bf6-a56b-7036ad380787","Type":"ContainerStarted","Data":"47619cbc0df4fb985600d1822015290e0b829210d1ab7c6623e688f54c1a4fb5"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.336801 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"71ce69a3-094f-4115-a4b1-f8ad35e6de47","Type":"ContainerStarted","Data":"023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.336831 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"71ce69a3-094f-4115-a4b1-f8ad35e6de47","Type":"ContainerStarted","Data":"e32d237bf9590500a9e12ab1c726f5d0818185201447e4af328d8ee4c75ef3b1"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.339294 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"c31eb1f5-ab2d-48d6-82d8-6af5678a670d","Type":"ContainerStarted","Data":"df1ce42939844cc243229418a966ebb2f97994c12840e662c2d9a0ccb21bbd50"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.339341 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" 
event={"ID":"c31eb1f5-ab2d-48d6-82d8-6af5678a670d","Type":"ContainerStarted","Data":"59426aec751321b417b21dcdb2bddada7577234358debe06a9fa058bd1bb9102"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.340712 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"ca1ad7ec-2dae-4d89-a517-70beea3f1321","Type":"ContainerStarted","Data":"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.340740 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"ca1ad7ec-2dae-4d89-a517-70beea3f1321","Type":"ContainerStarted","Data":"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.340751 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"ca1ad7ec-2dae-4d89-a517-70beea3f1321","Type":"ContainerStarted","Data":"56f3d99f9cb4ecd5fa4a17a195c207d1e7f3a789b932aa1f8eed37c5f593803e"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.344467 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec","Type":"ContainerStarted","Data":"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.344497 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec","Type":"ContainerStarted","Data":"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.344506 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" 
event={"ID":"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec","Type":"ContainerStarted","Data":"eb1ce23cac7fb34b6af80294761d511dd321171d07ffd0c1fdd1080da5a9aec1"} Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.371208 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" podStartSLOduration=1.371186105 podStartE2EDuration="1.371186105s" podCreationTimestamp="2026-03-11 19:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:15.353880481 +0000 UTC m=+1681.001576781" watchObservedRunningTime="2026-03-11 19:17:15.371186105 +0000 UTC m=+1681.018882385" Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.371715 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.371709439 podStartE2EDuration="2.371709439s" podCreationTimestamp="2026-03-11 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:15.367062374 +0000 UTC m=+1681.014758654" watchObservedRunningTime="2026-03-11 19:17:15.371709439 +0000 UTC m=+1681.019405729" Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.384096 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" podStartSLOduration=2.38407815 podStartE2EDuration="2.38407815s" podCreationTimestamp="2026-03-11 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:15.379928089 +0000 UTC m=+1681.027624389" watchObservedRunningTime="2026-03-11 19:17:15.38407815 +0000 UTC m=+1681.031774430" Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.404139 4842 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podStartSLOduration=2.404117877 podStartE2EDuration="2.404117877s" podCreationTimestamp="2026-03-11 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:15.398069025 +0000 UTC m=+1681.045765325" watchObservedRunningTime="2026-03-11 19:17:15.404117877 +0000 UTC m=+1681.051814147" Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.417062 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.416966681 podStartE2EDuration="2.416966681s" podCreationTimestamp="2026-03-11 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:15.412600844 +0000 UTC m=+1681.060297144" watchObservedRunningTime="2026-03-11 19:17:15.416966681 +0000 UTC m=+1681.064662981" Mar 11 19:17:15 crc kubenswrapper[4842]: I0311 19:17:15.436389 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.436361811 podStartE2EDuration="2.436361811s" podCreationTimestamp="2026-03-11 19:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:15.428901631 +0000 UTC m=+1681.076597921" watchObservedRunningTime="2026-03-11 19:17:15.436361811 +0000 UTC m=+1681.084058101" Mar 11 19:17:18 crc kubenswrapper[4842]: I0311 19:17:18.402865 4842 generic.go:334] "Generic (PLEG): container finished" podID="04abba73-25fa-4cda-b3cc-6a2cc23a769b" containerID="b9b07fbe672763ff024113057c1090a46efba5122d370ec0c70d7beb8256f443" exitCode=0 Mar 11 19:17:18 crc kubenswrapper[4842]: I0311 19:17:18.403448 4842 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" event={"ID":"04abba73-25fa-4cda-b3cc-6a2cc23a769b","Type":"ContainerDied","Data":"b9b07fbe672763ff024113057c1090a46efba5122d370ec0c70d7beb8256f443"} Mar 11 19:17:18 crc kubenswrapper[4842]: I0311 19:17:18.975455 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.186621 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.422627 4842 generic.go:334] "Generic (PLEG): container finished" podID="6f373954-daef-4bf6-a56b-7036ad380787" containerID="47619cbc0df4fb985600d1822015290e0b829210d1ab7c6623e688f54c1a4fb5" exitCode=0 Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.422689 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" event={"ID":"6f373954-daef-4bf6-a56b-7036ad380787","Type":"ContainerDied","Data":"47619cbc0df4fb985600d1822015290e0b829210d1ab7c6623e688f54c1a4fb5"} Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.731044 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.828593 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qnc8\" (UniqueName: \"kubernetes.io/projected/04abba73-25fa-4cda-b3cc-6a2cc23a769b-kube-api-access-5qnc8\") pod \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.828655 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-scripts\") pod \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.828702 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-config-data\") pod \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\" (UID: \"04abba73-25fa-4cda-b3cc-6a2cc23a769b\") " Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.833265 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-scripts" (OuterVolumeSpecName: "scripts") pod "04abba73-25fa-4cda-b3cc-6a2cc23a769b" (UID: "04abba73-25fa-4cda-b3cc-6a2cc23a769b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.833746 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04abba73-25fa-4cda-b3cc-6a2cc23a769b-kube-api-access-5qnc8" (OuterVolumeSpecName: "kube-api-access-5qnc8") pod "04abba73-25fa-4cda-b3cc-6a2cc23a769b" (UID: "04abba73-25fa-4cda-b3cc-6a2cc23a769b"). InnerVolumeSpecName "kube-api-access-5qnc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.855504 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-config-data" (OuterVolumeSpecName: "config-data") pod "04abba73-25fa-4cda-b3cc-6a2cc23a769b" (UID: "04abba73-25fa-4cda-b3cc-6a2cc23a769b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.930234 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qnc8\" (UniqueName: \"kubernetes.io/projected/04abba73-25fa-4cda-b3cc-6a2cc23a769b-kube-api-access-5qnc8\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.930280 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:19 crc kubenswrapper[4842]: I0311 19:17:19.930292 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04abba73-25fa-4cda-b3cc-6a2cc23a769b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.441155 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" event={"ID":"04abba73-25fa-4cda-b3cc-6a2cc23a769b","Type":"ContainerDied","Data":"7f283a2280e000b3aca587e46c8ce2a3b258ec7a925a83ec587f5e76e3a522d9"} Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.442524 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f283a2280e000b3aca587e46c8ce2a3b258ec7a925a83ec587f5e76e3a522d9" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.441190 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.505635 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:17:20 crc kubenswrapper[4842]: E0311 19:17:20.506416 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04abba73-25fa-4cda-b3cc-6a2cc23a769b" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.506446 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="04abba73-25fa-4cda-b3cc-6a2cc23a769b" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.506741 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="04abba73-25fa-4cda-b3cc-6a2cc23a769b" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.507634 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.512724 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.519171 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.644114 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c49e464-bc56-4675-a6e9-9e5997a85430-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"6c49e464-bc56-4675-a6e9-9e5997a85430\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.644333 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-592zd\" (UniqueName: \"kubernetes.io/projected/6c49e464-bc56-4675-a6e9-9e5997a85430-kube-api-access-592zd\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"6c49e464-bc56-4675-a6e9-9e5997a85430\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.745558 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-592zd\" (UniqueName: \"kubernetes.io/projected/6c49e464-bc56-4675-a6e9-9e5997a85430-kube-api-access-592zd\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"6c49e464-bc56-4675-a6e9-9e5997a85430\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.745617 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c49e464-bc56-4675-a6e9-9e5997a85430-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: 
\"6c49e464-bc56-4675-a6e9-9e5997a85430\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.751402 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c49e464-bc56-4675-a6e9-9e5997a85430-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"6c49e464-bc56-4675-a6e9-9e5997a85430\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.761196 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-592zd\" (UniqueName: \"kubernetes.io/projected/6c49e464-bc56-4675-a6e9-9e5997a85430-kube-api-access-592zd\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"6c49e464-bc56-4675-a6e9-9e5997a85430\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.809326 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.836109 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.949841 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-config-data\") pod \"6f373954-daef-4bf6-a56b-7036ad380787\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.949952 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-scripts\") pod \"6f373954-daef-4bf6-a56b-7036ad380787\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.950074 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb7v8\" (UniqueName: \"kubernetes.io/projected/6f373954-daef-4bf6-a56b-7036ad380787-kube-api-access-zb7v8\") pod \"6f373954-daef-4bf6-a56b-7036ad380787\" (UID: \"6f373954-daef-4bf6-a56b-7036ad380787\") " Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.953593 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f373954-daef-4bf6-a56b-7036ad380787-kube-api-access-zb7v8" (OuterVolumeSpecName: "kube-api-access-zb7v8") pod "6f373954-daef-4bf6-a56b-7036ad380787" (UID: "6f373954-daef-4bf6-a56b-7036ad380787"). InnerVolumeSpecName "kube-api-access-zb7v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.954442 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-scripts" (OuterVolumeSpecName: "scripts") pod "6f373954-daef-4bf6-a56b-7036ad380787" (UID: "6f373954-daef-4bf6-a56b-7036ad380787"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:20 crc kubenswrapper[4842]: I0311 19:17:20.972498 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-config-data" (OuterVolumeSpecName: "config-data") pod "6f373954-daef-4bf6-a56b-7036ad380787" (UID: "6f373954-daef-4bf6-a56b-7036ad380787"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.052843 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.052869 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb7v8\" (UniqueName: \"kubernetes.io/projected/6f373954-daef-4bf6-a56b-7036ad380787-kube-api-access-zb7v8\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.052883 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f373954-daef-4bf6-a56b-7036ad380787-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.259680 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:17:21 crc kubenswrapper[4842]: W0311 19:17:21.260091 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c49e464_bc56_4675_a6e9_9e5997a85430.slice/crio-3ef4eab3428a2d27404c1b003f56340e8c846a9b371062cec5c0dbd5a49a7505 WatchSource:0}: Error finding container 3ef4eab3428a2d27404c1b003f56340e8c846a9b371062cec5c0dbd5a49a7505: Status 404 returned error can't find the container with id 3ef4eab3428a2d27404c1b003f56340e8c846a9b371062cec5c0dbd5a49a7505 Mar 11 
19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.454309 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"6c49e464-bc56-4675-a6e9-9e5997a85430","Type":"ContainerStarted","Data":"9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0"} Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.454366 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"6c49e464-bc56-4675-a6e9-9e5997a85430","Type":"ContainerStarted","Data":"3ef4eab3428a2d27404c1b003f56340e8c846a9b371062cec5c0dbd5a49a7505"} Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.454494 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.455719 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" event={"ID":"6f373954-daef-4bf6-a56b-7036ad380787","Type":"ContainerDied","Data":"f9a2f893896bae0c0e580ca4fb8c4a0a1ac248a9c9b08770cb39043a5dd368ac"} Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.455743 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9a2f893896bae0c0e580ca4fb8c4a0a1ac248a9c9b08770cb39043a5dd368ac" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.455813 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.498436 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=1.498416115 podStartE2EDuration="1.498416115s" podCreationTimestamp="2026-03-11 19:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:21.476944659 +0000 UTC m=+1687.124640959" watchObservedRunningTime="2026-03-11 19:17:21.498416115 +0000 UTC m=+1687.146112405" Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.637471 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.637751 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-log" containerID="cri-o://526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad" gracePeriod=30 Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.637821 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-api" containerID="cri-o://7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d" gracePeriod=30 Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.658840 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.659110 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="71ce69a3-094f-4115-a4b1-f8ad35e6de47" containerName="nova-kuttl-scheduler-scheduler" 
containerID="cri-o://023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2" gracePeriod=30 Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.761362 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.761567 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-log" containerID="cri-o://852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad" gracePeriod=30 Mar 11 19:17:21 crc kubenswrapper[4842]: I0311 19:17:21.761731 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3" gracePeriod=30 Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.145150 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.239554 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.274453 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rjc\" (UniqueName: \"kubernetes.io/projected/ca1ad7ec-2dae-4d89-a517-70beea3f1321-kube-api-access-n4rjc\") pod \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.274599 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1ad7ec-2dae-4d89-a517-70beea3f1321-config-data\") pod \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.274730 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1ad7ec-2dae-4d89-a517-70beea3f1321-logs\") pod \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\" (UID: \"ca1ad7ec-2dae-4d89-a517-70beea3f1321\") " Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.275249 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1ad7ec-2dae-4d89-a517-70beea3f1321-logs" (OuterVolumeSpecName: "logs") pod "ca1ad7ec-2dae-4d89-a517-70beea3f1321" (UID: "ca1ad7ec-2dae-4d89-a517-70beea3f1321"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.280184 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1ad7ec-2dae-4d89-a517-70beea3f1321-kube-api-access-n4rjc" (OuterVolumeSpecName: "kube-api-access-n4rjc") pod "ca1ad7ec-2dae-4d89-a517-70beea3f1321" (UID: "ca1ad7ec-2dae-4d89-a517-70beea3f1321"). InnerVolumeSpecName "kube-api-access-n4rjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.297649 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1ad7ec-2dae-4d89-a517-70beea3f1321-config-data" (OuterVolumeSpecName: "config-data") pod "ca1ad7ec-2dae-4d89-a517-70beea3f1321" (UID: "ca1ad7ec-2dae-4d89-a517-70beea3f1321"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.375684 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgb7n\" (UniqueName: \"kubernetes.io/projected/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-kube-api-access-wgb7n\") pod \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.375762 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-logs\") pod \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.375811 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-config-data\") pod \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\" (UID: \"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec\") " Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.376195 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1ad7ec-2dae-4d89-a517-70beea3f1321-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.376208 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rjc\" (UniqueName: 
\"kubernetes.io/projected/ca1ad7ec-2dae-4d89-a517-70beea3f1321-kube-api-access-n4rjc\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.376218 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1ad7ec-2dae-4d89-a517-70beea3f1321-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.376607 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-logs" (OuterVolumeSpecName: "logs") pod "9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" (UID: "9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.378454 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-kube-api-access-wgb7n" (OuterVolumeSpecName: "kube-api-access-wgb7n") pod "9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" (UID: "9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec"). InnerVolumeSpecName "kube-api-access-wgb7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.403627 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-config-data" (OuterVolumeSpecName: "config-data") pod "9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" (UID: "9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.479182 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgb7n\" (UniqueName: \"kubernetes.io/projected/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-kube-api-access-wgb7n\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.479232 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.479253 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.480579 4842 generic.go:334] "Generic (PLEG): container finished" podID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerID="7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d" exitCode=0 Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.480622 4842 generic.go:334] "Generic (PLEG): container finished" podID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerID="526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad" exitCode=143 Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.480688 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"ca1ad7ec-2dae-4d89-a517-70beea3f1321","Type":"ContainerDied","Data":"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d"} Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.480765 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"ca1ad7ec-2dae-4d89-a517-70beea3f1321","Type":"ContainerDied","Data":"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad"} Mar 11 19:17:22 crc 
kubenswrapper[4842]: I0311 19:17:22.480795 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"ca1ad7ec-2dae-4d89-a517-70beea3f1321","Type":"ContainerDied","Data":"56f3d99f9cb4ecd5fa4a17a195c207d1e7f3a789b932aa1f8eed37c5f593803e"} Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.480824 4842 scope.go:117] "RemoveContainer" containerID="7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.480977 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.489456 4842 generic.go:334] "Generic (PLEG): container finished" podID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerID="1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3" exitCode=0 Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.489585 4842 generic.go:334] "Generic (PLEG): container finished" podID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerID="852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad" exitCode=143 Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.489721 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.489730 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec","Type":"ContainerDied","Data":"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3"} Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.489800 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec","Type":"ContainerDied","Data":"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad"} Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.489811 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec","Type":"ContainerDied","Data":"eb1ce23cac7fb34b6af80294761d511dd321171d07ffd0c1fdd1080da5a9aec1"} Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.522430 4842 scope.go:117] "RemoveContainer" containerID="526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.530290 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.539413 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.554155 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.563230 4842 scope.go:117] "RemoveContainer" containerID="7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.567113 4842 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.567629 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-log" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.567658 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-log" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.567703 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f373954-daef-4bf6-a56b-7036ad380787" containerName="nova-manage" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.567713 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f373954-daef-4bf6-a56b-7036ad380787" containerName="nova-manage" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.567727 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-metadata" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.567738 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-metadata" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.567758 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-api" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.567769 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-api" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.567796 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-log" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.567808 4842 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-log" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.567994 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-log" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.568012 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f373954-daef-4bf6-a56b-7036ad380787" containerName="nova-manage" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.568029 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" containerName="nova-kuttl-api-api" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.568042 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-log" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.568057 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" containerName="nova-kuttl-metadata-metadata" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.568053 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d\": container with ID starting with 7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d not found: ID does not exist" containerID="7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.568188 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d"} err="failed to get container status \"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d\": rpc error: code = NotFound desc = could not find container 
\"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d\": container with ID starting with 7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.568215 4842 scope.go:117] "RemoveContainer" containerID="526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.569251 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.569736 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad\": container with ID starting with 526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad not found: ID does not exist" containerID="526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.569849 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad"} err="failed to get container status \"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad\": rpc error: code = NotFound desc = could not find container \"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad\": container with ID starting with 526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.569935 4842 scope.go:117] "RemoveContainer" containerID="7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.570620 4842 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d"} err="failed to get container status \"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d\": rpc error: code = NotFound desc = could not find container \"7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d\": container with ID starting with 7f98ee11e7fde3e611f45c9c41f38917a1d640f60afe2a4b222abcf1e0376f9d not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.570657 4842 scope.go:117] "RemoveContainer" containerID="526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.570990 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad"} err="failed to get container status \"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad\": rpc error: code = NotFound desc = could not find container \"526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad\": container with ID starting with 526e4e080ec60a59085f623d420a6838ed1799f7ace2a38dd7fcc6119c6decad not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.571068 4842 scope.go:117] "RemoveContainer" containerID="1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.578802 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.585791 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.589200 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.599606 4842 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.601188 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.603210 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.608158 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.610423 4842 scope.go:117] "RemoveContainer" containerID="852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.634808 4842 scope.go:117] "RemoveContainer" containerID="1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.635262 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3\": container with ID starting with 1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3 not found: ID does not exist" containerID="1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.635320 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3"} err="failed to get container status \"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3\": rpc error: code = NotFound desc = could not find container \"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3\": container with ID starting with 1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3 
not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.635346 4842 scope.go:117] "RemoveContainer" containerID="852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad" Mar 11 19:17:22 crc kubenswrapper[4842]: E0311 19:17:22.635769 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad\": container with ID starting with 852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad not found: ID does not exist" containerID="852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.635808 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad"} err="failed to get container status \"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad\": rpc error: code = NotFound desc = could not find container \"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad\": container with ID starting with 852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.635858 4842 scope.go:117] "RemoveContainer" containerID="1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.636134 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3"} err="failed to get container status \"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3\": rpc error: code = NotFound desc = could not find container \"1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3\": container with ID starting with 
1d1754237b678864de085b2459bbd77b7674b727a6ca04976722787e0d604af3 not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.636156 4842 scope.go:117] "RemoveContainer" containerID="852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.636399 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad"} err="failed to get container status \"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad\": rpc error: code = NotFound desc = could not find container \"852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad\": container with ID starting with 852b91ce59c0e61ed6927133948deb6614871e28282e4d4e04224295d1bc82ad not found: ID does not exist" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.685732 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqp59\" (UniqueName: \"kubernetes.io/projected/836d17ca-fb8f-44fb-864b-064593e3eb90-kube-api-access-fqp59\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.685822 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.685880 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcbv\" (UniqueName: \"kubernetes.io/projected/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-kube-api-access-8gcbv\") pod \"nova-kuttl-metadata-0\" (UID: 
\"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.685929 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836d17ca-fb8f-44fb-864b-064593e3eb90-config-data\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.686112 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.686238 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/836d17ca-fb8f-44fb-864b-064593e3eb90-logs\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.787039 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.787104 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/836d17ca-fb8f-44fb-864b-064593e3eb90-logs\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:22 crc 
kubenswrapper[4842]: I0311 19:17:22.787151 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqp59\" (UniqueName: \"kubernetes.io/projected/836d17ca-fb8f-44fb-864b-064593e3eb90-kube-api-access-fqp59\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.787176 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.787208 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcbv\" (UniqueName: \"kubernetes.io/projected/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-kube-api-access-8gcbv\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.787233 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836d17ca-fb8f-44fb-864b-064593e3eb90-config-data\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.787793 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/836d17ca-fb8f-44fb-864b-064593e3eb90-logs\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.788743 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.794149 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836d17ca-fb8f-44fb-864b-064593e3eb90-config-data\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.808896 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.818603 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcbv\" (UniqueName: \"kubernetes.io/projected/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-kube-api-access-8gcbv\") pod \"nova-kuttl-metadata-0\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.818989 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqp59\" (UniqueName: \"kubernetes.io/projected/836d17ca-fb8f-44fb-864b-064593e3eb90-kube-api-access-fqp59\") pod \"nova-kuttl-api-0\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.893546 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.917602 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.978689 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec" path="/var/lib/kubelet/pods/9ab3e6ef-d1e3-4a0a-9d36-691c0570a6ec/volumes"
Mar 11 19:17:22 crc kubenswrapper[4842]: I0311 19:17:22.980255 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1ad7ec-2dae-4d89-a517-70beea3f1321" path="/var/lib/kubelet/pods/ca1ad7ec-2dae-4d89-a517-70beea3f1321/volumes"
Mar 11 19:17:23 crc kubenswrapper[4842]: I0311 19:17:23.365669 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:17:23 crc kubenswrapper[4842]: W0311 19:17:23.371236 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf24b3ca4_4a22_4937_93bd_8baad18bdf5e.slice/crio-ea893e7298e2e9796d1a655012875f31dad7ae5556b10d852898144447c189ff WatchSource:0}: Error finding container ea893e7298e2e9796d1a655012875f31dad7ae5556b10d852898144447c189ff: Status 404 returned error can't find the container with id ea893e7298e2e9796d1a655012875f31dad7ae5556b10d852898144447c189ff
Mar 11 19:17:23 crc kubenswrapper[4842]: I0311 19:17:23.475209 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:17:23 crc kubenswrapper[4842]: I0311 19:17:23.499766 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"f24b3ca4-4a22-4937-93bd-8baad18bdf5e","Type":"ContainerStarted","Data":"ea893e7298e2e9796d1a655012875f31dad7ae5556b10d852898144447c189ff"}
Mar 11 19:17:23 crc kubenswrapper[4842]: I0311 19:17:23.503686 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"836d17ca-fb8f-44fb-864b-064593e3eb90","Type":"ContainerStarted","Data":"d148be3f36956bd1dd87e87289480f322aa7f67c6c681cde8dcff77936db0fde"}
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.187065 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.196007 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.525228 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"836d17ca-fb8f-44fb-864b-064593e3eb90","Type":"ContainerStarted","Data":"951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc"}
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.525593 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"836d17ca-fb8f-44fb-864b-064593e3eb90","Type":"ContainerStarted","Data":"3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973"}
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.529915 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"f24b3ca4-4a22-4937-93bd-8baad18bdf5e","Type":"ContainerStarted","Data":"bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf"}
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.529986 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"f24b3ca4-4a22-4937-93bd-8baad18bdf5e","Type":"ContainerStarted","Data":"56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b"}
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.545742 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.555482 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.555445368 podStartE2EDuration="2.555445368s" podCreationTimestamp="2026-03-11 19:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:24.548727598 +0000 UTC m=+1690.196423888" watchObservedRunningTime="2026-03-11 19:17:24.555445368 +0000 UTC m=+1690.203141648"
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.607677 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.607656267 podStartE2EDuration="2.607656267s" podCreationTimestamp="2026-03-11 19:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:24.594632408 +0000 UTC m=+1690.242328718" watchObservedRunningTime="2026-03-11 19:17:24.607656267 +0000 UTC m=+1690.255352557"
Mar 11 19:17:24 crc kubenswrapper[4842]: I0311 19:17:24.967034 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:17:24 crc kubenswrapper[4842]: E0311 19:17:24.967255 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:17:25 crc kubenswrapper[4842]: I0311 19:17:25.997752 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.145592 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgthd\" (UniqueName: \"kubernetes.io/projected/71ce69a3-094f-4115-a4b1-f8ad35e6de47-kube-api-access-rgthd\") pod \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") "
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.145754 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ce69a3-094f-4115-a4b1-f8ad35e6de47-config-data\") pod \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\" (UID: \"71ce69a3-094f-4115-a4b1-f8ad35e6de47\") "
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.154451 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ce69a3-094f-4115-a4b1-f8ad35e6de47-kube-api-access-rgthd" (OuterVolumeSpecName: "kube-api-access-rgthd") pod "71ce69a3-094f-4115-a4b1-f8ad35e6de47" (UID: "71ce69a3-094f-4115-a4b1-f8ad35e6de47"). InnerVolumeSpecName "kube-api-access-rgthd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.174122 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ce69a3-094f-4115-a4b1-f8ad35e6de47-config-data" (OuterVolumeSpecName: "config-data") pod "71ce69a3-094f-4115-a4b1-f8ad35e6de47" (UID: "71ce69a3-094f-4115-a4b1-f8ad35e6de47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.247871 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71ce69a3-094f-4115-a4b1-f8ad35e6de47-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.248106 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgthd\" (UniqueName: \"kubernetes.io/projected/71ce69a3-094f-4115-a4b1-f8ad35e6de47-kube-api-access-rgthd\") on node \"crc\" DevicePath \"\""
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.557603 4842 generic.go:334] "Generic (PLEG): container finished" podID="71ce69a3-094f-4115-a4b1-f8ad35e6de47" containerID="023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2" exitCode=0
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.557647 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"71ce69a3-094f-4115-a4b1-f8ad35e6de47","Type":"ContainerDied","Data":"023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2"}
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.557673 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"71ce69a3-094f-4115-a4b1-f8ad35e6de47","Type":"ContainerDied","Data":"e32d237bf9590500a9e12ab1c726f5d0818185201447e4af328d8ee4c75ef3b1"}
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.557678 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.557715 4842 scope.go:117] "RemoveContainer" containerID="023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.590690 4842 scope.go:117] "RemoveContainer" containerID="023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2"
Mar 11 19:17:26 crc kubenswrapper[4842]: E0311 19:17:26.603446 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2\": container with ID starting with 023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2 not found: ID does not exist" containerID="023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.603525 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2"} err="failed to get container status \"023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2\": rpc error: code = NotFound desc = could not find container \"023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2\": container with ID starting with 023266fd0e47a32961e7ba540ceac30ffc3130baa19ba7c8baebb1a63450cff2 not found: ID does not exist"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.609070 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.625956 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.645544 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:26 crc kubenswrapper[4842]: E0311 19:17:26.646044 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ce69a3-094f-4115-a4b1-f8ad35e6de47" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.646074 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ce69a3-094f-4115-a4b1-f8ad35e6de47" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.646417 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ce69a3-094f-4115-a4b1-f8ad35e6de47" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.647992 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.652040 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.682812 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.759540 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/422727cb-d558-4e33-a86b-3c130d3206a2-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.759769 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ldj4\" (UniqueName: \"kubernetes.io/projected/422727cb-d558-4e33-a86b-3c130d3206a2-kube-api-access-8ldj4\") pod \"nova-kuttl-scheduler-0\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.861148 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/422727cb-d558-4e33-a86b-3c130d3206a2-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.861619 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ldj4\" (UniqueName: \"kubernetes.io/projected/422727cb-d558-4e33-a86b-3c130d3206a2-kube-api-access-8ldj4\") pod \"nova-kuttl-scheduler-0\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.869700 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/422727cb-d558-4e33-a86b-3c130d3206a2-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.881607 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ldj4\" (UniqueName: \"kubernetes.io/projected/422727cb-d558-4e33-a86b-3c130d3206a2-kube-api-access-8ldj4\") pod \"nova-kuttl-scheduler-0\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.970635 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ce69a3-094f-4115-a4b1-f8ad35e6de47" path="/var/lib/kubelet/pods/71ce69a3-094f-4115-a4b1-f8ad35e6de47/volumes"
Mar 11 19:17:26 crc kubenswrapper[4842]: I0311 19:17:26.980235 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:27 crc kubenswrapper[4842]: I0311 19:17:27.489546 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:27 crc kubenswrapper[4842]: I0311 19:17:27.569095 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"422727cb-d558-4e33-a86b-3c130d3206a2","Type":"ContainerStarted","Data":"c3822d535a3b16fc45cdad2dbadd802d6824f49460d0991c50594ab48a9ec17b"}
Mar 11 19:17:28 crc kubenswrapper[4842]: I0311 19:17:28.582386 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"422727cb-d558-4e33-a86b-3c130d3206a2","Type":"ContainerStarted","Data":"55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352"}
Mar 11 19:17:28 crc kubenswrapper[4842]: I0311 19:17:28.609163 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.6091437170000003 podStartE2EDuration="2.609143717s" podCreationTimestamp="2026-03-11 19:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:28.600702531 +0000 UTC m=+1694.248398851" watchObservedRunningTime="2026-03-11 19:17:28.609143717 +0000 UTC m=+1694.256839987"
Mar 11 19:17:30 crc kubenswrapper[4842]: I0311 19:17:30.867911 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.354756 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"]
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.356113 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.358700 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-scripts"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.358941 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-config-data"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.370099 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"]
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.450357 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-scripts\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.450595 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf7ph\" (UniqueName: \"kubernetes.io/projected/ca705f61-aadc-49bd-a249-0bd50776875c-kube-api-access-lf7ph\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.450684 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-config-data\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.553106 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf7ph\" (UniqueName: \"kubernetes.io/projected/ca705f61-aadc-49bd-a249-0bd50776875c-kube-api-access-lf7ph\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.553335 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-config-data\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.553490 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-scripts\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.560039 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-scripts\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.560570 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-config-data\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.595688 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf7ph\" (UniqueName: \"kubernetes.io/projected/ca705f61-aadc-49bd-a249-0bd50776875c-kube-api-access-lf7ph\") pod \"nova-kuttl-cell1-cell-mapping-xssrl\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.682522 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.981084 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:31 crc kubenswrapper[4842]: I0311 19:17:31.985632 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"]
Mar 11 19:17:32 crc kubenswrapper[4842]: I0311 19:17:32.623105 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl" event={"ID":"ca705f61-aadc-49bd-a249-0bd50776875c","Type":"ContainerStarted","Data":"209c76ad00e8050ef56d0498c117c3619a868a3e2238905d7977851871e32d90"}
Mar 11 19:17:32 crc kubenswrapper[4842]: I0311 19:17:32.623151 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl" event={"ID":"ca705f61-aadc-49bd-a249-0bd50776875c","Type":"ContainerStarted","Data":"2c2c1a9fed7ed67587fec1f3bf41994d62c0f2bcce6dac3e27fcc4e9095e8354"}
Mar 11 19:17:32 crc kubenswrapper[4842]: I0311 19:17:32.645486 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl" podStartSLOduration=1.6454653609999998 podStartE2EDuration="1.645465361s" podCreationTimestamp="2026-03-11 19:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:32.636012098 +0000 UTC m=+1698.283708428" watchObservedRunningTime="2026-03-11 19:17:32.645465361 +0000 UTC m=+1698.293161641"
Mar 11 19:17:32 crc kubenswrapper[4842]: I0311 19:17:32.894637 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:32 crc kubenswrapper[4842]: I0311 19:17:32.894685 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:32 crc kubenswrapper[4842]: I0311 19:17:32.919014 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:32 crc kubenswrapper[4842]: I0311 19:17:32.919065 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:33 crc kubenswrapper[4842]: I0311 19:17:33.976533 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.211:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:34 crc kubenswrapper[4842]: I0311 19:17:34.058557 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:34 crc kubenswrapper[4842]: I0311 19:17:34.058594 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.211:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:34 crc kubenswrapper[4842]: I0311 19:17:34.058672 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:36 crc kubenswrapper[4842]: I0311 19:17:36.660618 4842 generic.go:334] "Generic (PLEG): container finished" podID="ca705f61-aadc-49bd-a249-0bd50776875c" containerID="209c76ad00e8050ef56d0498c117c3619a868a3e2238905d7977851871e32d90" exitCode=0
Mar 11 19:17:36 crc kubenswrapper[4842]: I0311 19:17:36.660700 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl" event={"ID":"ca705f61-aadc-49bd-a249-0bd50776875c","Type":"ContainerDied","Data":"209c76ad00e8050ef56d0498c117c3619a868a3e2238905d7977851871e32d90"}
Mar 11 19:17:36 crc kubenswrapper[4842]: I0311 19:17:36.980840 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:37 crc kubenswrapper[4842]: I0311 19:17:37.030801 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:37 crc kubenswrapper[4842]: I0311 19:17:37.702390 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:37 crc kubenswrapper[4842]: I0311 19:17:37.962290 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:17:37 crc kubenswrapper[4842]: E0311 19:17:37.962853 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:17:37 crc kubenswrapper[4842]: I0311 19:17:37.986351 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.064134 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf7ph\" (UniqueName: \"kubernetes.io/projected/ca705f61-aadc-49bd-a249-0bd50776875c-kube-api-access-lf7ph\") pod \"ca705f61-aadc-49bd-a249-0bd50776875c\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") "
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.064254 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-config-data\") pod \"ca705f61-aadc-49bd-a249-0bd50776875c\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") "
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.064421 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-scripts\") pod \"ca705f61-aadc-49bd-a249-0bd50776875c\" (UID: \"ca705f61-aadc-49bd-a249-0bd50776875c\") "
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.070621 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-scripts" (OuterVolumeSpecName: "scripts") pod "ca705f61-aadc-49bd-a249-0bd50776875c" (UID: "ca705f61-aadc-49bd-a249-0bd50776875c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.082428 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca705f61-aadc-49bd-a249-0bd50776875c-kube-api-access-lf7ph" (OuterVolumeSpecName: "kube-api-access-lf7ph") pod "ca705f61-aadc-49bd-a249-0bd50776875c" (UID: "ca705f61-aadc-49bd-a249-0bd50776875c"). InnerVolumeSpecName "kube-api-access-lf7ph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.093143 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-config-data" (OuterVolumeSpecName: "config-data") pod "ca705f61-aadc-49bd-a249-0bd50776875c" (UID: "ca705f61-aadc-49bd-a249-0bd50776875c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.168069 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.168794 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca705f61-aadc-49bd-a249-0bd50776875c-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.168819 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf7ph\" (UniqueName: \"kubernetes.io/projected/ca705f61-aadc-49bd-a249-0bd50776875c-kube-api-access-lf7ph\") on node \"crc\" DevicePath \"\""
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.681798 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl" event={"ID":"ca705f61-aadc-49bd-a249-0bd50776875c","Type":"ContainerDied","Data":"2c2c1a9fed7ed67587fec1f3bf41994d62c0f2bcce6dac3e27fcc4e9095e8354"}
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.682118 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2c1a9fed7ed67587fec1f3bf41994d62c0f2bcce6dac3e27fcc4e9095e8354"
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.681835 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.862433 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.862659 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-log" containerID="cri-o://3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973" gracePeriod=30
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.863387 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-api" containerID="cri-o://951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc" gracePeriod=30
Mar 11 19:17:38 crc kubenswrapper[4842]: I0311 19:17:38.946408 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:38.999848 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:39.000215 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-log" containerID="cri-o://56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b" gracePeriod=30
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:39.000360 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf" gracePeriod=30
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:39.694876 4842 generic.go:334] "Generic (PLEG): container finished" podID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerID="56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b" exitCode=143
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:39.694963 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"f24b3ca4-4a22-4937-93bd-8baad18bdf5e","Type":"ContainerDied","Data":"56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b"}
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:39.697196 4842 generic.go:334] "Generic (PLEG): container finished" podID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerID="3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973" exitCode=143
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:39.697248 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"836d17ca-fb8f-44fb-864b-064593e3eb90","Type":"ContainerDied","Data":"3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973"}
Mar 11 19:17:39 crc kubenswrapper[4842]: I0311 19:17:39.697392 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="422727cb-d558-4e33-a86b-3c130d3206a2" containerName="nova-kuttl-scheduler-scheduler"
containerID="cri-o://55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352" gracePeriod=30 Mar 11 19:17:40 crc kubenswrapper[4842]: I0311 19:17:40.893602 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:40 crc kubenswrapper[4842]: I0311 19:17:40.894981 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:40 crc kubenswrapper[4842]: I0311 19:17:40.918525 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:40 crc kubenswrapper[4842]: I0311 19:17:40.918614 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:41 crc kubenswrapper[4842]: E0311 19:17:41.982615 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:17:41 crc kubenswrapper[4842]: E0311 19:17:41.985314 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:17:41 crc kubenswrapper[4842]: E0311 19:17:41.988083 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:17:41 
crc kubenswrapper[4842]: E0311 19:17:41.988147 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="422727cb-d558-4e33-a86b-3c130d3206a2" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.437691 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.522895 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.543352 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqp59\" (UniqueName: \"kubernetes.io/projected/836d17ca-fb8f-44fb-864b-064593e3eb90-kube-api-access-fqp59\") pod \"836d17ca-fb8f-44fb-864b-064593e3eb90\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.543506 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836d17ca-fb8f-44fb-864b-064593e3eb90-config-data\") pod \"836d17ca-fb8f-44fb-864b-064593e3eb90\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.543608 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/836d17ca-fb8f-44fb-864b-064593e3eb90-logs\") pod \"836d17ca-fb8f-44fb-864b-064593e3eb90\" (UID: \"836d17ca-fb8f-44fb-864b-064593e3eb90\") " Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.544074 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/836d17ca-fb8f-44fb-864b-064593e3eb90-logs" (OuterVolumeSpecName: "logs") pod "836d17ca-fb8f-44fb-864b-064593e3eb90" (UID: "836d17ca-fb8f-44fb-864b-064593e3eb90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.549324 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836d17ca-fb8f-44fb-864b-064593e3eb90-kube-api-access-fqp59" (OuterVolumeSpecName: "kube-api-access-fqp59") pod "836d17ca-fb8f-44fb-864b-064593e3eb90" (UID: "836d17ca-fb8f-44fb-864b-064593e3eb90"). InnerVolumeSpecName "kube-api-access-fqp59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.566433 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836d17ca-fb8f-44fb-864b-064593e3eb90-config-data" (OuterVolumeSpecName: "config-data") pod "836d17ca-fb8f-44fb-864b-064593e3eb90" (UID: "836d17ca-fb8f-44fb-864b-064593e3eb90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.645483 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-config-data\") pod \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.645656 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gcbv\" (UniqueName: \"kubernetes.io/projected/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-kube-api-access-8gcbv\") pod \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.645713 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-logs\") pod \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\" (UID: \"f24b3ca4-4a22-4937-93bd-8baad18bdf5e\") " Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.645981 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836d17ca-fb8f-44fb-864b-064593e3eb90-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.646000 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/836d17ca-fb8f-44fb-864b-064593e3eb90-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.646011 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqp59\" (UniqueName: \"kubernetes.io/projected/836d17ca-fb8f-44fb-864b-064593e3eb90-kube-api-access-fqp59\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.646233 4842 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-logs" (OuterVolumeSpecName: "logs") pod "f24b3ca4-4a22-4937-93bd-8baad18bdf5e" (UID: "f24b3ca4-4a22-4937-93bd-8baad18bdf5e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.648101 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-kube-api-access-8gcbv" (OuterVolumeSpecName: "kube-api-access-8gcbv") pod "f24b3ca4-4a22-4937-93bd-8baad18bdf5e" (UID: "f24b3ca4-4a22-4937-93bd-8baad18bdf5e"). InnerVolumeSpecName "kube-api-access-8gcbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.663868 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-config-data" (OuterVolumeSpecName: "config-data") pod "f24b3ca4-4a22-4937-93bd-8baad18bdf5e" (UID: "f24b3ca4-4a22-4937-93bd-8baad18bdf5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.731436 4842 generic.go:334] "Generic (PLEG): container finished" podID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerID="951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc" exitCode=0 Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.731795 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"836d17ca-fb8f-44fb-864b-064593e3eb90","Type":"ContainerDied","Data":"951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc"} Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.731826 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"836d17ca-fb8f-44fb-864b-064593e3eb90","Type":"ContainerDied","Data":"d148be3f36956bd1dd87e87289480f322aa7f67c6c681cde8dcff77936db0fde"} Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.731848 4842 scope.go:117] "RemoveContainer" containerID="951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.731990 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.735245 4842 generic.go:334] "Generic (PLEG): container finished" podID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerID="bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf" exitCode=0 Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.735318 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"f24b3ca4-4a22-4937-93bd-8baad18bdf5e","Type":"ContainerDied","Data":"bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf"} Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.735354 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"f24b3ca4-4a22-4937-93bd-8baad18bdf5e","Type":"ContainerDied","Data":"ea893e7298e2e9796d1a655012875f31dad7ae5556b10d852898144447c189ff"} Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.735425 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.749052 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gcbv\" (UniqueName: \"kubernetes.io/projected/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-kube-api-access-8gcbv\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.749096 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.749110 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24b3ca4-4a22-4937-93bd-8baad18bdf5e-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.761517 4842 scope.go:117] "RemoveContainer" containerID="3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.809752 4842 scope.go:117] "RemoveContainer" containerID="951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc" Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.810886 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc\": container with ID starting with 951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc not found: ID does not exist" containerID="951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.810934 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc"} err="failed to get container status 
\"951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc\": rpc error: code = NotFound desc = could not find container \"951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc\": container with ID starting with 951b3e3ab94107976f35e95c3df146af76f76ea9503479f181c918fd196320dc not found: ID does not exist" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.810990 4842 scope.go:117] "RemoveContainer" containerID="3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973" Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.812676 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973\": container with ID starting with 3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973 not found: ID does not exist" containerID="3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.812707 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973"} err="failed to get container status \"3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973\": rpc error: code = NotFound desc = could not find container \"3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973\": container with ID starting with 3d10c9e73ef7e19ecf4dde5d378f16ce5d205663aa3f7cba0ff6b65fbc5c9973 not found: ID does not exist" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.812726 4842 scope.go:117] "RemoveContainer" containerID="bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.813822 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.823226 4842 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.832985 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.841384 4842 scope.go:117] "RemoveContainer" containerID="56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.847666 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859194 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.859545 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-log" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859561 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-log" Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.859585 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca705f61-aadc-49bd-a249-0bd50776875c" containerName="nova-manage" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859594 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca705f61-aadc-49bd-a249-0bd50776875c" containerName="nova-manage" Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.859608 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-api" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859616 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-api" Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.859635 4842 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-log" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859641 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-log" Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.859651 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-metadata" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859656 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-metadata" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859813 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca705f61-aadc-49bd-a249-0bd50776875c" containerName="nova-manage" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859828 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-api" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859838 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" containerName="nova-kuttl-api-log" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859847 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-metadata" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.859858 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" containerName="nova-kuttl-metadata-log" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.861150 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.863718 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.877497 4842 scope.go:117] "RemoveContainer" containerID="bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf" Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.880552 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf\": container with ID starting with bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf not found: ID does not exist" containerID="bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.880720 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf"} err="failed to get container status \"bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf\": rpc error: code = NotFound desc = could not find container \"bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf\": container with ID starting with bd50dbd6646784e35d4b06e1ca9aa7cafc063b2dcb17dde4a523da886abc23bf not found: ID does not exist" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.880762 4842 scope.go:117] "RemoveContainer" containerID="56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.881427 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: E0311 19:17:42.885492 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b\": container with ID starting with 56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b not found: ID does not exist" containerID="56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.885564 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b"} err="failed to get container status \"56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b\": rpc error: code = NotFound desc = could not find container \"56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b\": container with ID starting with 56bc16475a3f33564e66d57b237278916ff28f3de441ba23bf33acc24f223c2b not found: ID does not exist" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.888229 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.889404 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.892200 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.896510 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.952149 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3671efc-bed8-44b2-8663-60692f7a77a6-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.952235 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg9r6\" (UniqueName: \"kubernetes.io/projected/a3671efc-bed8-44b2-8663-60692f7a77a6-kube-api-access-bg9r6\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.952252 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3671efc-bed8-44b2-8663-60692f7a77a6-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.952303 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57a473-220e-4c5e-961c-7d5b738ced0f-logs\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 
19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.952526 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrm5l\" (UniqueName: \"kubernetes.io/projected/3c57a473-220e-4c5e-961c-7d5b738ced0f-kube-api-access-jrm5l\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.952594 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57a473-220e-4c5e-961c-7d5b738ced0f-config-data\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.972350 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836d17ca-fb8f-44fb-864b-064593e3eb90" path="/var/lib/kubelet/pods/836d17ca-fb8f-44fb-864b-064593e3eb90/volumes" Mar 11 19:17:42 crc kubenswrapper[4842]: I0311 19:17:42.973520 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24b3ca4-4a22-4937-93bd-8baad18bdf5e" path="/var/lib/kubelet/pods/f24b3ca4-4a22-4937-93bd-8baad18bdf5e/volumes" Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.053639 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrm5l\" (UniqueName: \"kubernetes.io/projected/3c57a473-220e-4c5e-961c-7d5b738ced0f-kube-api-access-jrm5l\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.053681 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57a473-220e-4c5e-961c-7d5b738ced0f-config-data\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " 
pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.053729 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3671efc-bed8-44b2-8663-60692f7a77a6-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.053807 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg9r6\" (UniqueName: \"kubernetes.io/projected/a3671efc-bed8-44b2-8663-60692f7a77a6-kube-api-access-bg9r6\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.053828 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3671efc-bed8-44b2-8663-60692f7a77a6-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.053866 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57a473-220e-4c5e-961c-7d5b738ced0f-logs\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.054336 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57a473-220e-4c5e-961c-7d5b738ced0f-logs\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.054364 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3671efc-bed8-44b2-8663-60692f7a77a6-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.057841 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3671efc-bed8-44b2-8663-60692f7a77a6-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.059025 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57a473-220e-4c5e-961c-7d5b738ced0f-config-data\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.075321 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrm5l\" (UniqueName: \"kubernetes.io/projected/3c57a473-220e-4c5e-961c-7d5b738ced0f-kube-api-access-jrm5l\") pod \"nova-kuttl-api-0\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.084855 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg9r6\" (UniqueName: \"kubernetes.io/projected/a3671efc-bed8-44b2-8663-60692f7a77a6-kube-api-access-bg9r6\") pod \"nova-kuttl-metadata-0\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.186404 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.207997 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.613520 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:17:43 crc kubenswrapper[4842]: W0311 19:17:43.632979 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c57a473_220e_4c5e_961c_7d5b738ced0f.slice/crio-a82fd9c81c082dc8e0837f08e0caa91a67d8f4b21ebd6975dc407f03fea997b6 WatchSource:0}: Error finding container a82fd9c81c082dc8e0837f08e0caa91a67d8f4b21ebd6975dc407f03fea997b6: Status 404 returned error can't find the container with id a82fd9c81c082dc8e0837f08e0caa91a67d8f4b21ebd6975dc407f03fea997b6
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.720103 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.744623 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"a3671efc-bed8-44b2-8663-60692f7a77a6","Type":"ContainerStarted","Data":"08fcfc73d689df29bb08685af1c6e1136cc6eacd46b9e471e0948b96ef31e622"}
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.748734 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"3c57a473-220e-4c5e-961c-7d5b738ced0f","Type":"ContainerStarted","Data":"a82fd9c81c082dc8e0837f08e0caa91a67d8f4b21ebd6975dc407f03fea997b6"}
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.751238 4842 generic.go:334] "Generic (PLEG): container finished" podID="422727cb-d558-4e33-a86b-3c130d3206a2" containerID="55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352" exitCode=0
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.751302 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"422727cb-d558-4e33-a86b-3c130d3206a2","Type":"ContainerDied","Data":"55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352"}
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.884513 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.965741 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ldj4\" (UniqueName: \"kubernetes.io/projected/422727cb-d558-4e33-a86b-3c130d3206a2-kube-api-access-8ldj4\") pod \"422727cb-d558-4e33-a86b-3c130d3206a2\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") "
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.965793 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/422727cb-d558-4e33-a86b-3c130d3206a2-config-data\") pod \"422727cb-d558-4e33-a86b-3c130d3206a2\" (UID: \"422727cb-d558-4e33-a86b-3c130d3206a2\") "
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.970446 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422727cb-d558-4e33-a86b-3c130d3206a2-kube-api-access-8ldj4" (OuterVolumeSpecName: "kube-api-access-8ldj4") pod "422727cb-d558-4e33-a86b-3c130d3206a2" (UID: "422727cb-d558-4e33-a86b-3c130d3206a2"). InnerVolumeSpecName "kube-api-access-8ldj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:17:43 crc kubenswrapper[4842]: I0311 19:17:43.993072 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/422727cb-d558-4e33-a86b-3c130d3206a2-config-data" (OuterVolumeSpecName: "config-data") pod "422727cb-d558-4e33-a86b-3c130d3206a2" (UID: "422727cb-d558-4e33-a86b-3c130d3206a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.067086 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ldj4\" (UniqueName: \"kubernetes.io/projected/422727cb-d558-4e33-a86b-3c130d3206a2-kube-api-access-8ldj4\") on node \"crc\" DevicePath \"\""
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.068216 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/422727cb-d558-4e33-a86b-3c130d3206a2-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.764478 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"422727cb-d558-4e33-a86b-3c130d3206a2","Type":"ContainerDied","Data":"c3822d535a3b16fc45cdad2dbadd802d6824f49460d0991c50594ab48a9ec17b"}
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.764785 4842 scope.go:117] "RemoveContainer" containerID="55e2280b4d3390d548361929feb496cccfbfaf31d3c0f776e5bb21bab87c8352"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.764510 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.767652 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"a3671efc-bed8-44b2-8663-60692f7a77a6","Type":"ContainerStarted","Data":"688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c"}
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.767710 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"a3671efc-bed8-44b2-8663-60692f7a77a6","Type":"ContainerStarted","Data":"11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2"}
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.771662 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"3c57a473-220e-4c5e-961c-7d5b738ced0f","Type":"ContainerStarted","Data":"016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2"}
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.771711 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"3c57a473-220e-4c5e-961c-7d5b738ced0f","Type":"ContainerStarted","Data":"560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0"}
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.808429 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.808410099 podStartE2EDuration="2.808410099s" podCreationTimestamp="2026-03-11 19:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:44.79760706 +0000 UTC m=+1710.445303380" watchObservedRunningTime="2026-03-11 19:17:44.808410099 +0000 UTC m=+1710.456106379"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.818479 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.818462319 podStartE2EDuration="2.818462319s" podCreationTimestamp="2026-03-11 19:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:44.8162661 +0000 UTC m=+1710.463962390" watchObservedRunningTime="2026-03-11 19:17:44.818462319 +0000 UTC m=+1710.466158619"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.854621 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.865836 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.877991 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:44 crc kubenswrapper[4842]: E0311 19:17:44.878415 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422727cb-d558-4e33-a86b-3c130d3206a2" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.878435 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="422727cb-d558-4e33-a86b-3c130d3206a2" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.878626 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="422727cb-d558-4e33-a86b-3c130d3206a2" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.879343 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.881926 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.887565 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.973842 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="422727cb-d558-4e33-a86b-3c130d3206a2" path="/var/lib/kubelet/pods/422727cb-d558-4e33-a86b-3c130d3206a2/volumes"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.984685 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b42bf-877b-4b5a-b0a3-998aa208a41d-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:44 crc kubenswrapper[4842]: I0311 19:17:44.984922 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87kjf\" (UniqueName: \"kubernetes.io/projected/a39b42bf-877b-4b5a-b0a3-998aa208a41d-kube-api-access-87kjf\") pod \"nova-kuttl-scheduler-0\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:45 crc kubenswrapper[4842]: I0311 19:17:45.087397 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b42bf-877b-4b5a-b0a3-998aa208a41d-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:45 crc kubenswrapper[4842]: I0311 19:17:45.087512 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87kjf\" (UniqueName: \"kubernetes.io/projected/a39b42bf-877b-4b5a-b0a3-998aa208a41d-kube-api-access-87kjf\") pod \"nova-kuttl-scheduler-0\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:45 crc kubenswrapper[4842]: I0311 19:17:45.093599 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b42bf-877b-4b5a-b0a3-998aa208a41d-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:45 crc kubenswrapper[4842]: I0311 19:17:45.114452 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87kjf\" (UniqueName: \"kubernetes.io/projected/a39b42bf-877b-4b5a-b0a3-998aa208a41d-kube-api-access-87kjf\") pod \"nova-kuttl-scheduler-0\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:45 crc kubenswrapper[4842]: I0311 19:17:45.212639 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:45 crc kubenswrapper[4842]: W0311 19:17:45.657447 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39b42bf_877b_4b5a_b0a3_998aa208a41d.slice/crio-e326c85627a13d9d98ddcdb6c7c68f88a86c5eaffdd574fc74729219565f4bad WatchSource:0}: Error finding container e326c85627a13d9d98ddcdb6c7c68f88a86c5eaffdd574fc74729219565f4bad: Status 404 returned error can't find the container with id e326c85627a13d9d98ddcdb6c7c68f88a86c5eaffdd574fc74729219565f4bad
Mar 11 19:17:45 crc kubenswrapper[4842]: I0311 19:17:45.658075 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:17:45 crc kubenswrapper[4842]: I0311 19:17:45.779252 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"a39b42bf-877b-4b5a-b0a3-998aa208a41d","Type":"ContainerStarted","Data":"e326c85627a13d9d98ddcdb6c7c68f88a86c5eaffdd574fc74729219565f4bad"}
Mar 11 19:17:46 crc kubenswrapper[4842]: I0311 19:17:46.792164 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"a39b42bf-877b-4b5a-b0a3-998aa208a41d","Type":"ContainerStarted","Data":"c3ad2425978b65f6439feb572682b513835f76760e6cefca16113e56c23118f3"}
Mar 11 19:17:46 crc kubenswrapper[4842]: I0311 19:17:46.816946 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.816928748 podStartE2EDuration="2.816928748s" podCreationTimestamp="2026-03-11 19:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:17:46.809957281 +0000 UTC m=+1712.457653591" watchObservedRunningTime="2026-03-11 19:17:46.816928748 +0000 UTC m=+1712.464625038"
Mar 11 19:17:50 crc kubenswrapper[4842]: I0311 19:17:50.212919 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:52 crc kubenswrapper[4842]: I0311 19:17:52.962750 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:17:52 crc kubenswrapper[4842]: E0311 19:17:52.963193 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:17:53 crc kubenswrapper[4842]: I0311 19:17:53.187498 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:53 crc kubenswrapper[4842]: I0311 19:17:53.187906 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:17:53 crc kubenswrapper[4842]: I0311 19:17:53.209740 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:53 crc kubenswrapper[4842]: I0311 19:17:53.209849 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:17:54 crc kubenswrapper[4842]: I0311 19:17:54.228440 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:54 crc kubenswrapper[4842]: I0311 19:17:54.269480 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:54 crc kubenswrapper[4842]: I0311 19:17:54.352545 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.216:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:54 crc kubenswrapper[4842]: I0311 19:17:54.352813 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.216:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:17:55 crc kubenswrapper[4842]: I0311 19:17:55.213313 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:55 crc kubenswrapper[4842]: I0311 19:17:55.239907 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:17:55 crc kubenswrapper[4842]: I0311 19:17:55.928238 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.162742 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554278-hgntf"]
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.165678 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554278-hgntf"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.169816 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.170373 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.170395 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.186820 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554278-hgntf"]
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.260536 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-955cf\" (UniqueName: \"kubernetes.io/projected/636db3de-b9c6-43f3-8897-39ce019bd74e-kube-api-access-955cf\") pod \"auto-csr-approver-29554278-hgntf\" (UID: \"636db3de-b9c6-43f3-8897-39ce019bd74e\") " pod="openshift-infra/auto-csr-approver-29554278-hgntf"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.363388 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-955cf\" (UniqueName: \"kubernetes.io/projected/636db3de-b9c6-43f3-8897-39ce019bd74e-kube-api-access-955cf\") pod \"auto-csr-approver-29554278-hgntf\" (UID: \"636db3de-b9c6-43f3-8897-39ce019bd74e\") " pod="openshift-infra/auto-csr-approver-29554278-hgntf"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.389348 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-955cf\" (UniqueName: \"kubernetes.io/projected/636db3de-b9c6-43f3-8897-39ce019bd74e-kube-api-access-955cf\") pod \"auto-csr-approver-29554278-hgntf\" (UID: \"636db3de-b9c6-43f3-8897-39ce019bd74e\") " pod="openshift-infra/auto-csr-approver-29554278-hgntf"
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.495223 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554278-hgntf"
Mar 11 19:18:00 crc kubenswrapper[4842]: W0311 19:18:00.973516 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636db3de_b9c6_43f3_8897_39ce019bd74e.slice/crio-07470ea0985567d880479dd256980b85c71647b01f628ebeedfb9c0523783283 WatchSource:0}: Error finding container 07470ea0985567d880479dd256980b85c71647b01f628ebeedfb9c0523783283: Status 404 returned error can't find the container with id 07470ea0985567d880479dd256980b85c71647b01f628ebeedfb9c0523783283
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.976995 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554278-hgntf"]
Mar 11 19:18:00 crc kubenswrapper[4842]: I0311 19:18:00.977900 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 19:18:01 crc kubenswrapper[4842]: I0311 19:18:01.187295 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:18:01 crc kubenswrapper[4842]: I0311 19:18:01.187568 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:18:01 crc kubenswrapper[4842]: I0311 19:18:01.209220 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:18:01 crc kubenswrapper[4842]: I0311 19:18:01.209318 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:18:01 crc kubenswrapper[4842]: I0311 19:18:01.977498 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554278-hgntf" event={"ID":"636db3de-b9c6-43f3-8897-39ce019bd74e","Type":"ContainerStarted","Data":"07470ea0985567d880479dd256980b85c71647b01f628ebeedfb9c0523783283"}
Mar 11 19:18:02 crc kubenswrapper[4842]: I0311 19:18:02.989334 4842 generic.go:334] "Generic (PLEG): container finished" podID="636db3de-b9c6-43f3-8897-39ce019bd74e" containerID="92401beb2ec2ceb19a76e81eebd663897e78135b94a6090782898b4dd81c5018" exitCode=0
Mar 11 19:18:02 crc kubenswrapper[4842]: I0311 19:18:02.989416 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554278-hgntf" event={"ID":"636db3de-b9c6-43f3-8897-39ce019bd74e","Type":"ContainerDied","Data":"92401beb2ec2ceb19a76e81eebd663897e78135b94a6090782898b4dd81c5018"}
Mar 11 19:18:03 crc kubenswrapper[4842]: I0311 19:18:03.193104 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:18:03 crc kubenswrapper[4842]: I0311 19:18:03.193725 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:18:03 crc kubenswrapper[4842]: I0311 19:18:03.198949 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:18:03 crc kubenswrapper[4842]: I0311 19:18:03.199658 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:18:03 crc kubenswrapper[4842]: I0311 19:18:03.211458 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:18:03 crc kubenswrapper[4842]: I0311 19:18:03.215309 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:18:03 crc kubenswrapper[4842]: I0311 19:18:03.216011 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:18:04 crc kubenswrapper[4842]: I0311 19:18:04.004944 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:18:04 crc kubenswrapper[4842]: I0311 19:18:04.366673 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554278-hgntf"
Mar 11 19:18:04 crc kubenswrapper[4842]: I0311 19:18:04.438143 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-955cf\" (UniqueName: \"kubernetes.io/projected/636db3de-b9c6-43f3-8897-39ce019bd74e-kube-api-access-955cf\") pod \"636db3de-b9c6-43f3-8897-39ce019bd74e\" (UID: \"636db3de-b9c6-43f3-8897-39ce019bd74e\") "
Mar 11 19:18:04 crc kubenswrapper[4842]: I0311 19:18:04.445179 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636db3de-b9c6-43f3-8897-39ce019bd74e-kube-api-access-955cf" (OuterVolumeSpecName: "kube-api-access-955cf") pod "636db3de-b9c6-43f3-8897-39ce019bd74e" (UID: "636db3de-b9c6-43f3-8897-39ce019bd74e"). InnerVolumeSpecName "kube-api-access-955cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:18:04 crc kubenswrapper[4842]: I0311 19:18:04.540651 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-955cf\" (UniqueName: \"kubernetes.io/projected/636db3de-b9c6-43f3-8897-39ce019bd74e-kube-api-access-955cf\") on node \"crc\" DevicePath \"\""
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.008756 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554278-hgntf" event={"ID":"636db3de-b9c6-43f3-8897-39ce019bd74e","Type":"ContainerDied","Data":"07470ea0985567d880479dd256980b85c71647b01f628ebeedfb9c0523783283"}
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.008811 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07470ea0985567d880479dd256980b85c71647b01f628ebeedfb9c0523783283"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.008938 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554278-hgntf"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.451261 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554272-glgns"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.460598 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554272-glgns"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.585202 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-1"]
Mar 11 19:18:05 crc kubenswrapper[4842]: E0311 19:18:05.585778 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636db3de-b9c6-43f3-8897-39ce019bd74e" containerName="oc"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.585804 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="636db3de-b9c6-43f3-8897-39ce019bd74e" containerName="oc"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.586077 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="636db3de-b9c6-43f3-8897-39ce019bd74e" containerName="oc"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.587158 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.591286 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-2"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.592542 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.604998 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-2"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.612645 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-1"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.655665 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jj8m\" (UniqueName: \"kubernetes.io/projected/f3a0491e-8184-4312-a283-91f394d597ff-kube-api-access-2jj8m\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.655706 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-config-data\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.655754 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a0491e-8184-4312-a283-91f394d597ff-logs\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.655788 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a0491e-8184-4312-a283-91f394d597ff-config-data\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.655874 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gln\" (UniqueName: \"kubernetes.io/projected/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-kube-api-access-28gln\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.655918 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-logs\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.757884 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jj8m\" (UniqueName: \"kubernetes.io/projected/f3a0491e-8184-4312-a283-91f394d597ff-kube-api-access-2jj8m\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.757929 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-config-data\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.757970 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a0491e-8184-4312-a283-91f394d597ff-logs\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.758021 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a0491e-8184-4312-a283-91f394d597ff-config-data\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.758506 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a0491e-8184-4312-a283-91f394d597ff-logs\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.758551 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gln\" (UniqueName: \"kubernetes.io/projected/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-kube-api-access-28gln\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.758585 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-logs\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.759080 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-logs\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.762717 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-config-data\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.769357 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a0491e-8184-4312-a283-91f394d597ff-config-data\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.774308 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gln\" (UniqueName: \"kubernetes.io/projected/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-kube-api-access-28gln\") pod \"nova-kuttl-api-1\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.782766 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jj8m\" (UniqueName: \"kubernetes.io/projected/f3a0491e-8184-4312-a283-91f394d597ff-kube-api-access-2jj8m\") pod \"nova-kuttl-api-2\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " pod="nova-kuttl-default/nova-kuttl-api-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.880539 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-1"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.881683 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.907518 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-2"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.908674 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.918042 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-1"
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.923882 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-2"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.933804 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-1"]
Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.941046 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.962291 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4llrv\" (UniqueName: \"kubernetes.io/projected/8cffefc0-8682-44f3-8b07-6d766905faf6-kube-api-access-4llrv\") pod \"nova-kuttl-cell0-conductor-1\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.962360 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvb2m\" (UniqueName: \"kubernetes.io/projected/241ab90a-71bc-4e09-a4e3-e620a090cdbf-kube-api-access-bvb2m\") pod \"nova-kuttl-cell0-conductor-2\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.962390 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241ab90a-71bc-4e09-a4e3-e620a090cdbf-config-data\") pod \"nova-kuttl-cell0-conductor-2\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:05 crc kubenswrapper[4842]: I0311 19:18:05.962448 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cffefc0-8682-44f3-8b07-6d766905faf6-config-data\") pod \"nova-kuttl-cell0-conductor-1\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.063909 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4llrv\" (UniqueName: 
\"kubernetes.io/projected/8cffefc0-8682-44f3-8b07-6d766905faf6-kube-api-access-4llrv\") pod \"nova-kuttl-cell0-conductor-1\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.064311 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvb2m\" (UniqueName: \"kubernetes.io/projected/241ab90a-71bc-4e09-a4e3-e620a090cdbf-kube-api-access-bvb2m\") pod \"nova-kuttl-cell0-conductor-2\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.064344 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241ab90a-71bc-4e09-a4e3-e620a090cdbf-config-data\") pod \"nova-kuttl-cell0-conductor-2\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.064420 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cffefc0-8682-44f3-8b07-6d766905faf6-config-data\") pod \"nova-kuttl-cell0-conductor-1\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.069919 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241ab90a-71bc-4e09-a4e3-e620a090cdbf-config-data\") pod \"nova-kuttl-cell0-conductor-2\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.070423 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8cffefc0-8682-44f3-8b07-6d766905faf6-config-data\") pod \"nova-kuttl-cell0-conductor-1\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.085796 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4llrv\" (UniqueName: \"kubernetes.io/projected/8cffefc0-8682-44f3-8b07-6d766905faf6-kube-api-access-4llrv\") pod \"nova-kuttl-cell0-conductor-1\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.089015 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvb2m\" (UniqueName: \"kubernetes.io/projected/241ab90a-71bc-4e09-a4e3-e620a090cdbf-kube-api-access-bvb2m\") pod \"nova-kuttl-cell0-conductor-2\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.205747 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.229061 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.416235 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-1"] Mar 11 19:18:06 crc kubenswrapper[4842]: W0311 19:18:06.418927 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7fcfc70_0a88_4099_869c_aed4a16dc1a3.slice/crio-66a781bf318816afc880bc99c923a44225ada67276c2f2d07d7833684d78937d WatchSource:0}: Error finding container 66a781bf318816afc880bc99c923a44225ada67276c2f2d07d7833684d78937d: Status 404 returned error can't find the container with id 66a781bf318816afc880bc99c923a44225ada67276c2f2d07d7833684d78937d Mar 11 19:18:06 crc kubenswrapper[4842]: W0311 19:18:06.464091 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3a0491e_8184_4312_a283_91f394d597ff.slice/crio-25c8fc07ae90aca740fedb00583ec4792963927273a2d232db86ce0fd2a97f82 WatchSource:0}: Error finding container 25c8fc07ae90aca740fedb00583ec4792963927273a2d232db86ce0fd2a97f82: Status 404 returned error can't find the container with id 25c8fc07ae90aca740fedb00583ec4792963927273a2d232db86ce0fd2a97f82 Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.464838 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-2"] Mar 11 19:18:06 crc kubenswrapper[4842]: W0311 19:18:06.711028 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241ab90a_71bc_4e09_a4e3_e620a090cdbf.slice/crio-9e180029f3feca34c8f0094bc6e85ee39cfcce62a8f1de122ab66fda27dd7e63 WatchSource:0}: Error finding container 9e180029f3feca34c8f0094bc6e85ee39cfcce62a8f1de122ab66fda27dd7e63: Status 404 returned error can't find the container with id 
9e180029f3feca34c8f0094bc6e85ee39cfcce62a8f1de122ab66fda27dd7e63 Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.713130 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-2"] Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.769808 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-1"] Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.963508 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:18:06 crc kubenswrapper[4842]: E0311 19:18:06.963807 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:18:06 crc kubenswrapper[4842]: I0311 19:18:06.973383 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b2ac53-6114-4e08-9757-e28296a29695" path="/var/lib/kubelet/pods/f4b2ac53-6114-4e08-9757-e28296a29695/volumes" Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.039565 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-2" event={"ID":"f3a0491e-8184-4312-a283-91f394d597ff","Type":"ContainerStarted","Data":"46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.039634 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-2" event={"ID":"f3a0491e-8184-4312-a283-91f394d597ff","Type":"ContainerStarted","Data":"bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a"} Mar 11 19:18:07 crc kubenswrapper[4842]: 
I0311 19:18:07.039650 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-2" event={"ID":"f3a0491e-8184-4312-a283-91f394d597ff","Type":"ContainerStarted","Data":"25c8fc07ae90aca740fedb00583ec4792963927273a2d232db86ce0fd2a97f82"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.042295 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" event={"ID":"8cffefc0-8682-44f3-8b07-6d766905faf6","Type":"ContainerStarted","Data":"6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.042331 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" event={"ID":"8cffefc0-8682-44f3-8b07-6d766905faf6","Type":"ContainerStarted","Data":"5482b0d5b5f502eba5e85e66b2fae5ad39da29738cbeb18d6bd24929947ddd79"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.042799 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.047260 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" event={"ID":"241ab90a-71bc-4e09-a4e3-e620a090cdbf","Type":"ContainerStarted","Data":"7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.047418 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" event={"ID":"241ab90a-71bc-4e09-a4e3-e620a090cdbf","Type":"ContainerStarted","Data":"9e180029f3feca34c8f0094bc6e85ee39cfcce62a8f1de122ab66fda27dd7e63"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.048219 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 
19:18:07.051818 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-1" event={"ID":"e7fcfc70-0a88-4099-869c-aed4a16dc1a3","Type":"ContainerStarted","Data":"042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.051895 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-1" event={"ID":"e7fcfc70-0a88-4099-869c-aed4a16dc1a3","Type":"ContainerStarted","Data":"2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.051917 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-1" event={"ID":"e7fcfc70-0a88-4099-869c-aed4a16dc1a3","Type":"ContainerStarted","Data":"66a781bf318816afc880bc99c923a44225ada67276c2f2d07d7833684d78937d"} Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.066739 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-2" podStartSLOduration=2.066711641 podStartE2EDuration="2.066711641s" podCreationTimestamp="2026-03-11 19:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:07.061521662 +0000 UTC m=+1732.709217962" watchObservedRunningTime="2026-03-11 19:18:07.066711641 +0000 UTC m=+1732.714407921" Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.089547 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-1" podStartSLOduration=2.089517472 podStartE2EDuration="2.089517472s" podCreationTimestamp="2026-03-11 19:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:07.076875573 +0000 UTC m=+1732.724571843" watchObservedRunningTime="2026-03-11 19:18:07.089517472 +0000 
UTC m=+1732.737213752" Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.105526 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" podStartSLOduration=2.10549882 podStartE2EDuration="2.10549882s" podCreationTimestamp="2026-03-11 19:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:07.093330154 +0000 UTC m=+1732.741026434" watchObservedRunningTime="2026-03-11 19:18:07.10549882 +0000 UTC m=+1732.753195100" Mar 11 19:18:07 crc kubenswrapper[4842]: I0311 19:18:07.117001 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" podStartSLOduration=2.116976978 podStartE2EDuration="2.116976978s" podCreationTimestamp="2026-03-11 19:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:07.107889194 +0000 UTC m=+1732.755585474" watchObservedRunningTime="2026-03-11 19:18:07.116976978 +0000 UTC m=+1732.764673258" Mar 11 19:18:11 crc kubenswrapper[4842]: I0311 19:18:11.231556 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:15 crc kubenswrapper[4842]: I0311 19:18:15.918783 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:15 crc kubenswrapper[4842]: I0311 19:18:15.920397 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:15 crc kubenswrapper[4842]: I0311 19:18:15.943406 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:15 crc kubenswrapper[4842]: I0311 19:18:15.943507 4842 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:16 crc kubenswrapper[4842]: I0311 19:18:16.264108 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.082684 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-1" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.082767 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-2" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.082686 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-1" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.082692 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-2" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.575038 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-2"] Mar 11 19:18:17 crc 
kubenswrapper[4842]: I0311 19:18:17.577480 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.588009 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-1"] Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.590006 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.608727 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-2"] Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.619014 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-1"] Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.665667 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-config-data\") pod \"nova-kuttl-scheduler-2\" (UID: \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.665897 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea0f15c-863a-46b4-9a4f-42df55730e40-config-data\") pod \"nova-kuttl-scheduler-1\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.665982 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7v8\" (UniqueName: \"kubernetes.io/projected/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-kube-api-access-tr7v8\") pod \"nova-kuttl-scheduler-2\" (UID: 
\"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.666094 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjskn\" (UniqueName: \"kubernetes.io/projected/4ea0f15c-863a-46b4-9a4f-42df55730e40-kube-api-access-bjskn\") pod \"nova-kuttl-scheduler-1\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.679536 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-1"] Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.680878 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.695319 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-2"] Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.697079 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.716415 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-1"] Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.736686 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-2"] Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771699 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjx8\" (UniqueName: \"kubernetes.io/projected/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-kube-api-access-fpjx8\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771749 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d486n\" (UniqueName: \"kubernetes.io/projected/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-kube-api-access-d486n\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771788 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea0f15c-863a-46b4-9a4f-42df55730e40-config-data\") pod \"nova-kuttl-scheduler-1\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771824 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-config-data\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " 
pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771853 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7v8\" (UniqueName: \"kubernetes.io/projected/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-kube-api-access-tr7v8\") pod \"nova-kuttl-scheduler-2\" (UID: \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771894 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjskn\" (UniqueName: \"kubernetes.io/projected/4ea0f15c-863a-46b4-9a4f-42df55730e40-kube-api-access-bjskn\") pod \"nova-kuttl-scheduler-1\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771923 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-config-data\") pod \"nova-kuttl-scheduler-2\" (UID: \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771948 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-logs\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.771990 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-config-data\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " 
pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.772007 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-logs\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.789740 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-config-data\") pod \"nova-kuttl-scheduler-2\" (UID: \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.791253 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7v8\" (UniqueName: \"kubernetes.io/projected/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-kube-api-access-tr7v8\") pod \"nova-kuttl-scheduler-2\" (UID: \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.797659 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea0f15c-863a-46b4-9a4f-42df55730e40-config-data\") pod \"nova-kuttl-scheduler-1\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.801107 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjskn\" (UniqueName: \"kubernetes.io/projected/4ea0f15c-863a-46b4-9a4f-42df55730e40-kube-api-access-bjskn\") pod \"nova-kuttl-scheduler-1\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 
19:18:17.873897 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjx8\" (UniqueName: \"kubernetes.io/projected/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-kube-api-access-fpjx8\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.873953 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d486n\" (UniqueName: \"kubernetes.io/projected/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-kube-api-access-d486n\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.873997 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-config-data\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.874059 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-logs\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.874112 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-config-data\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.874135 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-logs\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.874643 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-logs\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.875079 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-logs\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.879849 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-config-data\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.880811 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-config-data\") pod \"nova-kuttl-metadata-2\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.890317 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjx8\" (UniqueName: \"kubernetes.io/projected/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-kube-api-access-fpjx8\") pod \"nova-kuttl-metadata-2\" (UID: 
\"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.891224 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d486n\" (UniqueName: \"kubernetes.io/projected/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-kube-api-access-d486n\") pod \"nova-kuttl-metadata-1\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.892711 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:17 crc kubenswrapper[4842]: I0311 19:18:17.909401 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.005884 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.027671 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.456091 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-2"] Mar 11 19:18:18 crc kubenswrapper[4842]: W0311 19:18:18.465300 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b485bd9_dd54_4ba8_b27f_ceda50b858f8.slice/crio-87c8090253e638a7ed3915cc6947133db35efd0ee991fd919725168a89d6bd08 WatchSource:0}: Error finding container 87c8090253e638a7ed3915cc6947133db35efd0ee991fd919725168a89d6bd08: Status 404 returned error can't find the container with id 87c8090253e638a7ed3915cc6947133db35efd0ee991fd919725168a89d6bd08 Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.590702 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-1"] Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.709903 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-2"] Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.721078 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-1"] Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.846446 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-1"] Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.857374 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.868341 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-2"] Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.869969 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.880134 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-2"] Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.887420 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-1"] Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.892094 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502dc472-1dee-4d14-97a3-38494f63d086-config-data\") pod \"nova-kuttl-cell1-conductor-1\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.892144 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jssq\" (UniqueName: \"kubernetes.io/projected/502dc472-1dee-4d14-97a3-38494f63d086-kube-api-access-2jssq\") pod \"nova-kuttl-cell1-conductor-1\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.962371 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:18:18 crc kubenswrapper[4842]: E0311 19:18:18.962661 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.993315 
4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502dc472-1dee-4d14-97a3-38494f63d086-config-data\") pod \"nova-kuttl-cell1-conductor-1\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.995008 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jssq\" (UniqueName: \"kubernetes.io/projected/502dc472-1dee-4d14-97a3-38494f63d086-kube-api-access-2jssq\") pod \"nova-kuttl-cell1-conductor-1\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.995177 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1335c672-48b7-46e6-a70e-eb54e14ce800-config-data\") pod \"nova-kuttl-cell1-conductor-2\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:18 crc kubenswrapper[4842]: I0311 19:18:18.995287 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8l8j\" (UniqueName: \"kubernetes.io/projected/1335c672-48b7-46e6-a70e-eb54e14ce800-kube-api-access-c8l8j\") pod \"nova-kuttl-cell1-conductor-2\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.003635 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502dc472-1dee-4d14-97a3-38494f63d086-config-data\") pod \"nova-kuttl-cell1-conductor-1\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:19 crc kubenswrapper[4842]: 
I0311 19:18:19.014523 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jssq\" (UniqueName: \"kubernetes.io/projected/502dc472-1dee-4d14-97a3-38494f63d086-kube-api-access-2jssq\") pod \"nova-kuttl-cell1-conductor-1\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.097211 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1335c672-48b7-46e6-a70e-eb54e14ce800-config-data\") pod \"nova-kuttl-cell1-conductor-2\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.097340 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8l8j\" (UniqueName: \"kubernetes.io/projected/1335c672-48b7-46e6-a70e-eb54e14ce800-kube-api-access-c8l8j\") pod \"nova-kuttl-cell1-conductor-2\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.100972 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1335c672-48b7-46e6-a70e-eb54e14ce800-config-data\") pod \"nova-kuttl-cell1-conductor-2\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.118933 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8l8j\" (UniqueName: \"kubernetes.io/projected/1335c672-48b7-46e6-a70e-eb54e14ce800-kube-api-access-c8l8j\") pod \"nova-kuttl-cell1-conductor-2\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 
19:18:19.171119 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-2" event={"ID":"0b485bd9-dd54-4ba8-b27f-ceda50b858f8","Type":"ContainerStarted","Data":"ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.171165 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-2" event={"ID":"0b485bd9-dd54-4ba8-b27f-ceda50b858f8","Type":"ContainerStarted","Data":"87c8090253e638a7ed3915cc6947133db35efd0ee991fd919725168a89d6bd08"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.173095 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-1" event={"ID":"4ea0f15c-863a-46b4-9a4f-42df55730e40","Type":"ContainerStarted","Data":"abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.173177 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-1" event={"ID":"4ea0f15c-863a-46b4-9a4f-42df55730e40","Type":"ContainerStarted","Data":"b4bcd227d8218dbe9f302f595aa953418a06c2159aa14dbcc22309c1eaf50501"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.176503 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-2" event={"ID":"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961","Type":"ContainerStarted","Data":"8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.176535 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-2" event={"ID":"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961","Type":"ContainerStarted","Data":"d252eefd8f737d9cd965cd708b5df805069970708286edcdbc5e3da27a6ee93c"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.183931 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="nova-kuttl-default/nova-kuttl-metadata-1" event={"ID":"fc95ee94-67ba-4093-a29d-846ab4c1d6c0","Type":"ContainerStarted","Data":"d2865305926eb583aa015c3bc2be7dd7853774c055cc2c04626a5c1e181dea17"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.183973 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-1" event={"ID":"fc95ee94-67ba-4093-a29d-846ab4c1d6c0","Type":"ContainerStarted","Data":"962855932e5a2b95e659e50bbcc99c5998bbeb49d37db0b3495738db334a69b1"} Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.201817 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-2" podStartSLOduration=2.201793783 podStartE2EDuration="2.201793783s" podCreationTimestamp="2026-03-11 19:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:19.19197234 +0000 UTC m=+1744.839668640" watchObservedRunningTime="2026-03-11 19:18:19.201793783 +0000 UTC m=+1744.849490063" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.271920 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.290082 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.694518 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-1" podStartSLOduration=2.6945032749999998 podStartE2EDuration="2.694503275s" podCreationTimestamp="2026-03-11 19:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:19.213340152 +0000 UTC m=+1744.861036442" watchObservedRunningTime="2026-03-11 19:18:19.694503275 +0000 UTC m=+1745.342199555" Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.699965 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-1"] Mar 11 19:18:19 crc kubenswrapper[4842]: I0311 19:18:19.863563 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-2"] Mar 11 19:18:19 crc kubenswrapper[4842]: W0311 19:18:19.870085 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1335c672_48b7_46e6_a70e_eb54e14ce800.slice/crio-a7cfa960a911c714077274b760a6cfe2885dfe31e66fc0cc3de691927d52ac5e WatchSource:0}: Error finding container a7cfa960a911c714077274b760a6cfe2885dfe31e66fc0cc3de691927d52ac5e: Status 404 returned error can't find the container with id a7cfa960a911c714077274b760a6cfe2885dfe31e66fc0cc3de691927d52ac5e Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.198666 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-2" event={"ID":"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961","Type":"ContainerStarted","Data":"ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414"} Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.206383 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="nova-kuttl-default/nova-kuttl-metadata-1" event={"ID":"fc95ee94-67ba-4093-a29d-846ab4c1d6c0","Type":"ContainerStarted","Data":"ec2ae559ee2a537cd01bf3df65efbfa656b17172c8d7754c869d37954dd4734d"} Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.211238 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" event={"ID":"1335c672-48b7-46e6-a70e-eb54e14ce800","Type":"ContainerStarted","Data":"9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a"} Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.211314 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" event={"ID":"1335c672-48b7-46e6-a70e-eb54e14ce800","Type":"ContainerStarted","Data":"a7cfa960a911c714077274b760a6cfe2885dfe31e66fc0cc3de691927d52ac5e"} Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.211510 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.214121 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" event={"ID":"502dc472-1dee-4d14-97a3-38494f63d086","Type":"ContainerStarted","Data":"30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155"} Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.214246 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" event={"ID":"502dc472-1dee-4d14-97a3-38494f63d086","Type":"ContainerStarted","Data":"bf9b6d996d17956edfb1577866f524199996bac0d0305d7199c3bf332e251e48"} Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.220457 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-2" podStartSLOduration=3.220436188 podStartE2EDuration="3.220436188s" podCreationTimestamp="2026-03-11 19:18:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:20.217239342 +0000 UTC m=+1745.864935622" watchObservedRunningTime="2026-03-11 19:18:20.220436188 +0000 UTC m=+1745.868132468" Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.240843 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" podStartSLOduration=2.240819114 podStartE2EDuration="2.240819114s" podCreationTimestamp="2026-03-11 19:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:20.237983948 +0000 UTC m=+1745.885680258" watchObservedRunningTime="2026-03-11 19:18:20.240819114 +0000 UTC m=+1745.888515394" Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.277657 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-1" podStartSLOduration=3.277638391 podStartE2EDuration="3.277638391s" podCreationTimestamp="2026-03-11 19:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:20.261138879 +0000 UTC m=+1745.908835159" watchObservedRunningTime="2026-03-11 19:18:20.277638391 +0000 UTC m=+1745.925334671" Mar 11 19:18:20 crc kubenswrapper[4842]: I0311 19:18:20.305960 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" podStartSLOduration=2.305923859 podStartE2EDuration="2.305923859s" podCreationTimestamp="2026-03-11 19:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:18:20.297843942 +0000 UTC m=+1745.945540222" watchObservedRunningTime="2026-03-11 19:18:20.305923859 +0000 UTC m=+1745.953620149" 
Mar 11 19:18:21 crc kubenswrapper[4842]: I0311 19:18:21.227988 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:22 crc kubenswrapper[4842]: I0311 19:18:22.893505 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:22 crc kubenswrapper[4842]: I0311 19:18:22.909939 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:23 crc kubenswrapper[4842]: I0311 19:18:23.918624 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:23 crc kubenswrapper[4842]: I0311 19:18:23.918711 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:23 crc kubenswrapper[4842]: I0311 19:18:23.942510 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:23 crc kubenswrapper[4842]: I0311 19:18:23.942601 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:24 crc kubenswrapper[4842]: I0311 19:18:24.314684 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:25 crc kubenswrapper[4842]: I0311 19:18:25.922433 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:25 crc kubenswrapper[4842]: I0311 19:18:25.923753 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:25 crc kubenswrapper[4842]: I0311 19:18:25.925988 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:25 crc 
kubenswrapper[4842]: I0311 19:18:25.951014 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:25 crc kubenswrapper[4842]: I0311 19:18:25.954943 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:25 crc kubenswrapper[4842]: I0311 19:18:25.960446 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:26 crc kubenswrapper[4842]: I0311 19:18:26.284367 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:26 crc kubenswrapper[4842]: I0311 19:18:26.284456 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:27 crc kubenswrapper[4842]: I0311 19:18:27.893184 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:27 crc kubenswrapper[4842]: I0311 19:18:27.910033 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:27 crc kubenswrapper[4842]: I0311 19:18:27.922927 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:27 crc kubenswrapper[4842]: I0311 19:18:27.936140 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.007231 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.007325 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 
19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.028454 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.028522 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.361343 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.375074 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.507983 4842 scope.go:117] "RemoveContainer" containerID="a63c3238bfac46b9c23626d6ade9d53415f70c92a5e0fee91287df2d9fc57637" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.551038 4842 scope.go:117] "RemoveContainer" containerID="edd7b95b78f42abd379e320016674b340cbca802d22798680f08d41a376a0564" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.634777 4842 scope.go:117] "RemoveContainer" containerID="4e686365a3ab688a6aab2ab926145acecb1fa50309580ebea4d78a1a4eb0d505" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.705362 4842 scope.go:117] "RemoveContainer" containerID="b48db79b04ca4e83ed26998ec2e6a74da2495699c80c59933de44a3b7f18ab67" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.741999 4842 scope.go:117] "RemoveContainer" containerID="657aec834e4d5c4a396ecde7752aeb66b8f6a57912102f54ece9a6d9b736589a" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.771188 4842 scope.go:117] "RemoveContainer" containerID="5b7f363886c60e12a093cfa74d7e5d9a0d20de1fab9acfb0f8360ffd24fe569a" Mar 11 19:18:28 crc kubenswrapper[4842]: I0311 19:18:28.811398 4842 scope.go:117] "RemoveContainer" containerID="53b16f0eb12ca94a03d896c9d4d12003a1b24c2fe96715ffa4b8e7263c1ec4e1" Mar 11 
19:18:29 crc kubenswrapper[4842]: I0311 19:18:29.057071 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-1" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.225:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:29 crc kubenswrapper[4842]: I0311 19:18:29.097460 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-1" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.225:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:29 crc kubenswrapper[4842]: I0311 19:18:29.179654 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-2" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.226:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:29 crc kubenswrapper[4842]: I0311 19:18:29.179697 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-2" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.226:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:18:29 crc kubenswrapper[4842]: I0311 19:18:29.320203 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:29 crc kubenswrapper[4842]: I0311 19:18:29.962260 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:18:29 crc kubenswrapper[4842]: E0311 
19:18:29.962541 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:18:36 crc kubenswrapper[4842]: I0311 19:18:36.007422 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:36 crc kubenswrapper[4842]: I0311 19:18:36.009125 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:36 crc kubenswrapper[4842]: I0311 19:18:36.028536 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:36 crc kubenswrapper[4842]: I0311 19:18:36.028593 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.008972 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.010145 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.013380 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.036569 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.036670 4842 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.049117 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.049596 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:38 crc kubenswrapper[4842]: I0311 19:18:38.440599 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.268842 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-2"] Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.269088 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-2" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-log" containerID="cri-o://bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a" gracePeriod=30 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.269261 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-2" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-api" containerID="cri-o://46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315" gracePeriod=30 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.291171 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-1"] Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.291760 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-1" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-log" 
containerID="cri-o://2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d" gracePeriod=30 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.292170 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-1" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-api" containerID="cri-o://042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442" gracePeriod=30 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.440509 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-2"] Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.440763 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" podUID="241ab90a-71bc-4e09-a4e3-e620a090cdbf" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" gracePeriod=30 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.453220 4842 generic.go:334] "Generic (PLEG): container finished" podID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerID="2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d" exitCode=143 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.453309 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-1" event={"ID":"e7fcfc70-0a88-4099-869c-aed4a16dc1a3","Type":"ContainerDied","Data":"2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d"} Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.453796 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-1"] Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.453956 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" 
podUID="8cffefc0-8682-44f3-8b07-6d766905faf6" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" gracePeriod=30 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.456872 4842 generic.go:334] "Generic (PLEG): container finished" podID="f3a0491e-8184-4312-a283-91f394d597ff" containerID="bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a" exitCode=143 Mar 11 19:18:39 crc kubenswrapper[4842]: I0311 19:18:39.456936 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-2" event={"ID":"f3a0491e-8184-4312-a283-91f394d597ff","Type":"ContainerDied","Data":"bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a"} Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 19:18:41.209565 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 19:18:41.212206 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 19:18:41.213669 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 
19:18:41.213705 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" podUID="8cffefc0-8682-44f3-8b07-6d766905faf6" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 19:18:41.232030 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 19:18:41.233648 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 19:18:41.236770 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:41 crc kubenswrapper[4842]: E0311 19:18:41.236800 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" podUID="241ab90a-71bc-4e09-a4e3-e620a090cdbf" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:18:42 crc kubenswrapper[4842]: 
I0311 19:18:42.462033 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-2" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": read tcp 10.217.0.2:58770->10.217.0.220:8774: read: connection reset by peer" Mar 11 19:18:42 crc kubenswrapper[4842]: I0311 19:18:42.462057 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-2" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": read tcp 10.217.0.2:58780->10.217.0.220:8774: read: connection reset by peer" Mar 11 19:18:42 crc kubenswrapper[4842]: I0311 19:18:42.476259 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-1" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": read tcp 10.217.0.2:59040->10.217.0.219:8774: read: connection reset by peer" Mar 11 19:18:42 crc kubenswrapper[4842]: I0311 19:18:42.476425 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-1" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.219:8774/\": read tcp 10.217.0.2:59032->10.217.0.219:8774: read: connection reset by peer" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.077108 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.085661 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.272909 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-logs\") pod \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.273003 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a0491e-8184-4312-a283-91f394d597ff-config-data\") pod \"f3a0491e-8184-4312-a283-91f394d597ff\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.273046 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jj8m\" (UniqueName: \"kubernetes.io/projected/f3a0491e-8184-4312-a283-91f394d597ff-kube-api-access-2jj8m\") pod \"f3a0491e-8184-4312-a283-91f394d597ff\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.273108 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a0491e-8184-4312-a283-91f394d597ff-logs\") pod \"f3a0491e-8184-4312-a283-91f394d597ff\" (UID: \"f3a0491e-8184-4312-a283-91f394d597ff\") " Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.273161 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28gln\" (UniqueName: \"kubernetes.io/projected/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-kube-api-access-28gln\") pod \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.273260 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-config-data\") pod \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\" (UID: \"e7fcfc70-0a88-4099-869c-aed4a16dc1a3\") " Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.273639 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-logs" (OuterVolumeSpecName: "logs") pod "e7fcfc70-0a88-4099-869c-aed4a16dc1a3" (UID: "e7fcfc70-0a88-4099-869c-aed4a16dc1a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.274212 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a0491e-8184-4312-a283-91f394d597ff-logs" (OuterVolumeSpecName: "logs") pod "f3a0491e-8184-4312-a283-91f394d597ff" (UID: "f3a0491e-8184-4312-a283-91f394d597ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.281058 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a0491e-8184-4312-a283-91f394d597ff-kube-api-access-2jj8m" (OuterVolumeSpecName: "kube-api-access-2jj8m") pod "f3a0491e-8184-4312-a283-91f394d597ff" (UID: "f3a0491e-8184-4312-a283-91f394d597ff"). InnerVolumeSpecName "kube-api-access-2jj8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.283397 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-kube-api-access-28gln" (OuterVolumeSpecName: "kube-api-access-28gln") pod "e7fcfc70-0a88-4099-869c-aed4a16dc1a3" (UID: "e7fcfc70-0a88-4099-869c-aed4a16dc1a3"). InnerVolumeSpecName "kube-api-access-28gln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.300760 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-config-data" (OuterVolumeSpecName: "config-data") pod "e7fcfc70-0a88-4099-869c-aed4a16dc1a3" (UID: "e7fcfc70-0a88-4099-869c-aed4a16dc1a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.310305 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a0491e-8184-4312-a283-91f394d597ff-config-data" (OuterVolumeSpecName: "config-data") pod "f3a0491e-8184-4312-a283-91f394d597ff" (UID: "f3a0491e-8184-4312-a283-91f394d597ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.375611 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.375900 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.375910 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a0491e-8184-4312-a283-91f394d597ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.375919 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jj8m\" (UniqueName: \"kubernetes.io/projected/f3a0491e-8184-4312-a283-91f394d597ff-kube-api-access-2jj8m\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:43 crc kubenswrapper[4842]: 
I0311 19:18:43.375934 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a0491e-8184-4312-a283-91f394d597ff-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.375943 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28gln\" (UniqueName: \"kubernetes.io/projected/e7fcfc70-0a88-4099-869c-aed4a16dc1a3-kube-api-access-28gln\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.500226 4842 generic.go:334] "Generic (PLEG): container finished" podID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerID="042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442" exitCode=0 Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.500310 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-1" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.500309 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-1" event={"ID":"e7fcfc70-0a88-4099-869c-aed4a16dc1a3","Type":"ContainerDied","Data":"042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442"} Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.500420 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-1" event={"ID":"e7fcfc70-0a88-4099-869c-aed4a16dc1a3","Type":"ContainerDied","Data":"66a781bf318816afc880bc99c923a44225ada67276c2f2d07d7833684d78937d"} Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.500446 4842 scope.go:117] "RemoveContainer" containerID="042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.502838 4842 generic.go:334] "Generic (PLEG): container finished" podID="f3a0491e-8184-4312-a283-91f394d597ff" containerID="46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315" exitCode=0 Mar 11 
19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.502860 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-2" event={"ID":"f3a0491e-8184-4312-a283-91f394d597ff","Type":"ContainerDied","Data":"46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315"} Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.502881 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-2" event={"ID":"f3a0491e-8184-4312-a283-91f394d597ff","Type":"ContainerDied","Data":"25c8fc07ae90aca740fedb00583ec4792963927273a2d232db86ce0fd2a97f82"} Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.502947 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-2" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.523492 4842 scope.go:117] "RemoveContainer" containerID="2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.613915 4842 scope.go:117] "RemoveContainer" containerID="042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442" Mar 11 19:18:43 crc kubenswrapper[4842]: E0311 19:18:43.616359 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442\": container with ID starting with 042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442 not found: ID does not exist" containerID="042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.616397 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442"} err="failed to get container status \"042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442\": rpc error: code = NotFound desc = could 
not find container \"042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442\": container with ID starting with 042432f6dfc4cb4c0671e84b9db98333fc7f6413f44ffeae37f0afe7d6ab0442 not found: ID does not exist" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.616422 4842 scope.go:117] "RemoveContainer" containerID="2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d" Mar 11 19:18:43 crc kubenswrapper[4842]: E0311 19:18:43.616801 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d\": container with ID starting with 2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d not found: ID does not exist" containerID="2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.616841 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d"} err="failed to get container status \"2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d\": rpc error: code = NotFound desc = could not find container \"2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d\": container with ID starting with 2c53ce6cdb6e324d612a4c1d8667483f6c09318e2182deaad1bed04cd125816d not found: ID does not exist" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.616868 4842 scope.go:117] "RemoveContainer" containerID="46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.623327 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-1"] Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.628993 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-1"] Mar 11 19:18:43 crc kubenswrapper[4842]: 
I0311 19:18:43.641849 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-2"] Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.650765 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-2"] Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.653229 4842 scope.go:117] "RemoveContainer" containerID="bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.678717 4842 scope.go:117] "RemoveContainer" containerID="46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315" Mar 11 19:18:43 crc kubenswrapper[4842]: E0311 19:18:43.679672 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315\": container with ID starting with 46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315 not found: ID does not exist" containerID="46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.679713 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315"} err="failed to get container status \"46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315\": rpc error: code = NotFound desc = could not find container \"46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315\": container with ID starting with 46e6df2863880914c312127590aff6f2d6176ba68f32fb83d2f845e212215315 not found: ID does not exist" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.679744 4842 scope.go:117] "RemoveContainer" containerID="bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a" Mar 11 19:18:43 crc kubenswrapper[4842]: E0311 19:18:43.680184 4842 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a\": container with ID starting with bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a not found: ID does not exist" containerID="bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.680219 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a"} err="failed to get container status \"bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a\": rpc error: code = NotFound desc = could not find container \"bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a\": container with ID starting with bba4b0433faf1dbfe7ebe0f6f8c86de4fcc134cdc71a534597a85b829899f26a not found: ID does not exist" Mar 11 19:18:43 crc kubenswrapper[4842]: I0311 19:18:43.962476 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:18:43 crc kubenswrapper[4842]: E0311 19:18:43.962672 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:18:44 crc kubenswrapper[4842]: I0311 19:18:44.991198 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" path="/var/lib/kubelet/pods/e7fcfc70-0a88-4099-869c-aed4a16dc1a3/volumes" Mar 11 19:18:44 crc kubenswrapper[4842]: I0311 19:18:44.992839 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f3a0491e-8184-4312-a283-91f394d597ff" path="/var/lib/kubelet/pods/f3a0491e-8184-4312-a283-91f394d597ff/volumes" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.379747 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.389429 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.511085 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241ab90a-71bc-4e09-a4e3-e620a090cdbf-config-data\") pod \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.511153 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4llrv\" (UniqueName: \"kubernetes.io/projected/8cffefc0-8682-44f3-8b07-6d766905faf6-kube-api-access-4llrv\") pod \"8cffefc0-8682-44f3-8b07-6d766905faf6\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.511241 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cffefc0-8682-44f3-8b07-6d766905faf6-config-data\") pod \"8cffefc0-8682-44f3-8b07-6d766905faf6\" (UID: \"8cffefc0-8682-44f3-8b07-6d766905faf6\") " Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.511308 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvb2m\" (UniqueName: \"kubernetes.io/projected/241ab90a-71bc-4e09-a4e3-e620a090cdbf-kube-api-access-bvb2m\") pod \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\" (UID: \"241ab90a-71bc-4e09-a4e3-e620a090cdbf\") " Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 
19:18:45.517892 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241ab90a-71bc-4e09-a4e3-e620a090cdbf-kube-api-access-bvb2m" (OuterVolumeSpecName: "kube-api-access-bvb2m") pod "241ab90a-71bc-4e09-a4e3-e620a090cdbf" (UID: "241ab90a-71bc-4e09-a4e3-e620a090cdbf"). InnerVolumeSpecName "kube-api-access-bvb2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.522661 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cffefc0-8682-44f3-8b07-6d766905faf6-kube-api-access-4llrv" (OuterVolumeSpecName: "kube-api-access-4llrv") pod "8cffefc0-8682-44f3-8b07-6d766905faf6" (UID: "8cffefc0-8682-44f3-8b07-6d766905faf6"). InnerVolumeSpecName "kube-api-access-4llrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.527668 4842 generic.go:334] "Generic (PLEG): container finished" podID="8cffefc0-8682-44f3-8b07-6d766905faf6" containerID="6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" exitCode=0 Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.527784 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" event={"ID":"8cffefc0-8682-44f3-8b07-6d766905faf6","Type":"ContainerDied","Data":"6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943"} Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.527781 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.527823 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-1" event={"ID":"8cffefc0-8682-44f3-8b07-6d766905faf6","Type":"ContainerDied","Data":"5482b0d5b5f502eba5e85e66b2fae5ad39da29738cbeb18d6bd24929947ddd79"} Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.527833 4842 scope.go:117] "RemoveContainer" containerID="6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.529564 4842 generic.go:334] "Generic (PLEG): container finished" podID="241ab90a-71bc-4e09-a4e3-e620a090cdbf" containerID="7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" exitCode=0 Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.529600 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" event={"ID":"241ab90a-71bc-4e09-a4e3-e620a090cdbf","Type":"ContainerDied","Data":"7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa"} Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.529619 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" event={"ID":"241ab90a-71bc-4e09-a4e3-e620a090cdbf","Type":"ContainerDied","Data":"9e180029f3feca34c8f0094bc6e85ee39cfcce62a8f1de122ab66fda27dd7e63"} Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.529589 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-2" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.542113 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cffefc0-8682-44f3-8b07-6d766905faf6-config-data" (OuterVolumeSpecName: "config-data") pod "8cffefc0-8682-44f3-8b07-6d766905faf6" (UID: "8cffefc0-8682-44f3-8b07-6d766905faf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.559965 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241ab90a-71bc-4e09-a4e3-e620a090cdbf-config-data" (OuterVolumeSpecName: "config-data") pod "241ab90a-71bc-4e09-a4e3-e620a090cdbf" (UID: "241ab90a-71bc-4e09-a4e3-e620a090cdbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.591147 4842 scope.go:117] "RemoveContainer" containerID="6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" Mar 11 19:18:45 crc kubenswrapper[4842]: E0311 19:18:45.591618 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943\": container with ID starting with 6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943 not found: ID does not exist" containerID="6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.591666 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943"} err="failed to get container status \"6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943\": rpc error: code = NotFound desc = could not find container 
\"6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943\": container with ID starting with 6e2bd11206bbdfef2a400f15b9ff6a22b7763de53ae5131946b4c66ece64a943 not found: ID does not exist" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.591698 4842 scope.go:117] "RemoveContainer" containerID="7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.613329 4842 scope.go:117] "RemoveContainer" containerID="7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" Mar 11 19:18:45 crc kubenswrapper[4842]: E0311 19:18:45.613694 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa\": container with ID starting with 7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa not found: ID does not exist" containerID="7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.613740 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241ab90a-71bc-4e09-a4e3-e620a090cdbf-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.613757 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4llrv\" (UniqueName: \"kubernetes.io/projected/8cffefc0-8682-44f3-8b07-6d766905faf6-kube-api-access-4llrv\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.613747 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa"} err="failed to get container status \"7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa\": rpc error: code = NotFound desc = could not find container 
\"7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa\": container with ID starting with 7fb691d6f88d642e3a01eb43ae9cc079d8f5bd4a1bd57228c6445c99f4c456aa not found: ID does not exist" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.613768 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cffefc0-8682-44f3-8b07-6d766905faf6-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.613805 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvb2m\" (UniqueName: \"kubernetes.io/projected/241ab90a-71bc-4e09-a4e3-e620a090cdbf-kube-api-access-bvb2m\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.896321 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-1"] Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.908208 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-1"] Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.915250 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-2"] Mar 11 19:18:45 crc kubenswrapper[4842]: I0311 19:18:45.921476 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-2"] Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.178362 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-2"] Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.178629 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-2" podUID="0b485bd9-dd54-4ba8-b27f-ceda50b858f8" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" gracePeriod=30 Mar 11 19:18:46 crc 
kubenswrapper[4842]: I0311 19:18:46.187656 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-1"] Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.187940 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-1" podUID="4ea0f15c-863a-46b4-9a4f-42df55730e40" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" gracePeriod=30 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.262235 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-2"] Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.262661 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-2" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-log" containerID="cri-o://8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a" gracePeriod=30 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.262752 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-2" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414" gracePeriod=30 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.325340 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-1"] Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.325546 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-1" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-log" containerID="cri-o://d2865305926eb583aa015c3bc2be7dd7853774c055cc2c04626a5c1e181dea17" gracePeriod=30 Mar 11 19:18:46 
crc kubenswrapper[4842]: I0311 19:18:46.325919 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-1" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://ec2ae559ee2a537cd01bf3df65efbfa656b17172c8d7754c869d37954dd4734d" gracePeriod=30 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.540065 4842 generic.go:334] "Generic (PLEG): container finished" podID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerID="8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a" exitCode=143 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.540126 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-2" event={"ID":"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961","Type":"ContainerDied","Data":"8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a"} Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.542123 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerID="d2865305926eb583aa015c3bc2be7dd7853774c055cc2c04626a5c1e181dea17" exitCode=143 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.542190 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-1" event={"ID":"fc95ee94-67ba-4093-a29d-846ab4c1d6c0","Type":"ContainerDied","Data":"d2865305926eb583aa015c3bc2be7dd7853774c055cc2c04626a5c1e181dea17"} Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.553603 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-2"] Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.553837 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" podUID="1335c672-48b7-46e6-a70e-eb54e14ce800" containerName="nova-kuttl-cell1-conductor-conductor" 
containerID="cri-o://9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a" gracePeriod=30 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.563653 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-1"] Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.564094 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" podUID="502dc472-1dee-4d14-97a3-38494f63d086" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155" gracePeriod=30 Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.972251 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241ab90a-71bc-4e09-a4e3-e620a090cdbf" path="/var/lib/kubelet/pods/241ab90a-71bc-4e09-a4e3-e620a090cdbf/volumes" Mar 11 19:18:46 crc kubenswrapper[4842]: I0311 19:18:46.973352 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cffefc0-8682-44f3-8b07-6d766905faf6" path="/var/lib/kubelet/pods/8cffefc0-8682-44f3-8b07-6d766905faf6/volumes" Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 19:18:47.895501 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 19:18:47.897930 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 
19:18:47.899318 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 19:18:47.899369 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-scheduler-2" podUID="0b485bd9-dd54-4ba8-b27f-ceda50b858f8" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 19:18:47.912798 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 19:18:47.914308 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 19:18:47.916481 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:18:47 crc kubenswrapper[4842]: E0311 19:18:47.916538 4842 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-scheduler-1" podUID="4ea0f15c-863a-46b4-9a4f-42df55730e40" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.275686 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.278758 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.283858 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.283941 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" podUID="502dc472-1dee-4d14-97a3-38494f63d086" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.293654 4842 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.295640 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.298016 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:18:49 crc kubenswrapper[4842]: E0311 19:18:49.298152 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" podUID="1335c672-48b7-46e6-a70e-eb54e14ce800" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:18:49 crc kubenswrapper[4842]: I0311 19:18:49.580918 4842 generic.go:334] "Generic (PLEG): container finished" podID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerID="ec2ae559ee2a537cd01bf3df65efbfa656b17172c8d7754c869d37954dd4734d" exitCode=0 Mar 11 19:18:49 crc kubenswrapper[4842]: I0311 19:18:49.580967 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-1" 
event={"ID":"fc95ee94-67ba-4093-a29d-846ab4c1d6c0","Type":"ContainerDied","Data":"ec2ae559ee2a537cd01bf3df65efbfa656b17172c8d7754c869d37954dd4734d"} Mar 11 19:18:49 crc kubenswrapper[4842]: I0311 19:18:49.918680 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:49 crc kubenswrapper[4842]: I0311 19:18:49.922858 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d486n\" (UniqueName: \"kubernetes.io/projected/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-kube-api-access-d486n\") pod \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " Mar 11 19:18:49 crc kubenswrapper[4842]: I0311 19:18:49.922952 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-config-data\") pod \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " Mar 11 19:18:49 crc kubenswrapper[4842]: I0311 19:18:49.930440 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-kube-api-access-d486n" (OuterVolumeSpecName: "kube-api-access-d486n") pod "fc95ee94-67ba-4093-a29d-846ab4c1d6c0" (UID: "fc95ee94-67ba-4093-a29d-846ab4c1d6c0"). InnerVolumeSpecName "kube-api-access-d486n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:49 crc kubenswrapper[4842]: I0311 19:18:49.982622 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-config-data" (OuterVolumeSpecName: "config-data") pod "fc95ee94-67ba-4093-a29d-846ab4c1d6c0" (UID: "fc95ee94-67ba-4093-a29d-846ab4c1d6c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.025046 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-logs\") pod \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\" (UID: \"fc95ee94-67ba-4093-a29d-846ab4c1d6c0\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.025481 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-logs" (OuterVolumeSpecName: "logs") pod "fc95ee94-67ba-4093-a29d-846ab4c1d6c0" (UID: "fc95ee94-67ba-4093-a29d-846ab4c1d6c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.025611 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d486n\" (UniqueName: \"kubernetes.io/projected/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-kube-api-access-d486n\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.025629 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.025640 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc95ee94-67ba-4093-a29d-846ab4c1d6c0-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.042667 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.127055 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-config-data\") pod \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.127231 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpjx8\" (UniqueName: \"kubernetes.io/projected/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-kube-api-access-fpjx8\") pod \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.127303 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-logs\") pod \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\" (UID: \"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.127984 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-logs" (OuterVolumeSpecName: "logs") pod "f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" (UID: "f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.128261 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.133510 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-kube-api-access-fpjx8" (OuterVolumeSpecName: "kube-api-access-fpjx8") pod "f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" (UID: "f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961"). InnerVolumeSpecName "kube-api-access-fpjx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.157523 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-config-data" (OuterVolumeSpecName: "config-data") pod "f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" (UID: "f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.230148 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.230180 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpjx8\" (UniqueName: \"kubernetes.io/projected/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961-kube-api-access-fpjx8\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.266319 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.331772 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-config-data\") pod \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\" (UID: \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.332092 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr7v8\" (UniqueName: \"kubernetes.io/projected/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-kube-api-access-tr7v8\") pod \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\" (UID: \"0b485bd9-dd54-4ba8-b27f-ceda50b858f8\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.334703 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-kube-api-access-tr7v8" (OuterVolumeSpecName: "kube-api-access-tr7v8") pod "0b485bd9-dd54-4ba8-b27f-ceda50b858f8" (UID: "0b485bd9-dd54-4ba8-b27f-ceda50b858f8"). InnerVolumeSpecName "kube-api-access-tr7v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.354483 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-config-data" (OuterVolumeSpecName: "config-data") pod "0b485bd9-dd54-4ba8-b27f-ceda50b858f8" (UID: "0b485bd9-dd54-4ba8-b27f-ceda50b858f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.434498 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.434585 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr7v8\" (UniqueName: \"kubernetes.io/projected/0b485bd9-dd54-4ba8-b27f-ceda50b858f8-kube-api-access-tr7v8\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.466938 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.535635 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea0f15c-863a-46b4-9a4f-42df55730e40-config-data\") pod \"4ea0f15c-863a-46b4-9a4f-42df55730e40\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.535700 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjskn\" (UniqueName: \"kubernetes.io/projected/4ea0f15c-863a-46b4-9a4f-42df55730e40-kube-api-access-bjskn\") pod \"4ea0f15c-863a-46b4-9a4f-42df55730e40\" (UID: \"4ea0f15c-863a-46b4-9a4f-42df55730e40\") " Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.538539 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea0f15c-863a-46b4-9a4f-42df55730e40-kube-api-access-bjskn" (OuterVolumeSpecName: "kube-api-access-bjskn") pod "4ea0f15c-863a-46b4-9a4f-42df55730e40" (UID: "4ea0f15c-863a-46b4-9a4f-42df55730e40"). InnerVolumeSpecName "kube-api-access-bjskn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.555633 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea0f15c-863a-46b4-9a4f-42df55730e40-config-data" (OuterVolumeSpecName: "config-data") pod "4ea0f15c-863a-46b4-9a4f-42df55730e40" (UID: "4ea0f15c-863a-46b4-9a4f-42df55730e40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.596243 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-1" event={"ID":"fc95ee94-67ba-4093-a29d-846ab4c1d6c0","Type":"ContainerDied","Data":"962855932e5a2b95e659e50bbcc99c5998bbeb49d37db0b3495738db334a69b1"} Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.596257 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-1" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.596422 4842 scope.go:117] "RemoveContainer" containerID="ec2ae559ee2a537cd01bf3df65efbfa656b17172c8d7754c869d37954dd4734d" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.600055 4842 generic.go:334] "Generic (PLEG): container finished" podID="0b485bd9-dd54-4ba8-b27f-ceda50b858f8" containerID="ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" exitCode=0 Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.600115 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-2" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.600197 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-2" event={"ID":"0b485bd9-dd54-4ba8-b27f-ceda50b858f8","Type":"ContainerDied","Data":"ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1"} Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.600225 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-2" event={"ID":"0b485bd9-dd54-4ba8-b27f-ceda50b858f8","Type":"ContainerDied","Data":"87c8090253e638a7ed3915cc6947133db35efd0ee991fd919725168a89d6bd08"} Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.603038 4842 generic.go:334] "Generic (PLEG): container finished" podID="4ea0f15c-863a-46b4-9a4f-42df55730e40" containerID="abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" exitCode=0 Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.603249 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-1" event={"ID":"4ea0f15c-863a-46b4-9a4f-42df55730e40","Type":"ContainerDied","Data":"abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25"} Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.603287 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-1" event={"ID":"4ea0f15c-863a-46b4-9a4f-42df55730e40","Type":"ContainerDied","Data":"b4bcd227d8218dbe9f302f595aa953418a06c2159aa14dbcc22309c1eaf50501"} Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.603335 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-1" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.609547 4842 generic.go:334] "Generic (PLEG): container finished" podID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerID="ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414" exitCode=0 Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.609597 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-2" event={"ID":"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961","Type":"ContainerDied","Data":"ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414"} Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.609629 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-2" event={"ID":"f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961","Type":"ContainerDied","Data":"d252eefd8f737d9cd965cd708b5df805069970708286edcdbc5e3da27a6ee93c"} Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.609697 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-2" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.638401 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea0f15c-863a-46b4-9a4f-42df55730e40-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.638618 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjskn\" (UniqueName: \"kubernetes.io/projected/4ea0f15c-863a-46b4-9a4f-42df55730e40-kube-api-access-bjskn\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.661631 4842 scope.go:117] "RemoveContainer" containerID="d2865305926eb583aa015c3bc2be7dd7853774c055cc2c04626a5c1e181dea17" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.678967 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-2"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.688303 4842 scope.go:117] "RemoveContainer" containerID="ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.690872 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-2"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.699213 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-1"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.712665 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-1"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.718827 4842 scope.go:117] "RemoveContainer" containerID="ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" Mar 11 19:18:50 crc kubenswrapper[4842]: E0311 19:18:50.719351 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1\": container with ID starting with ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1 not found: ID does not exist" containerID="ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.719395 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1"} err="failed to get container status \"ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1\": rpc error: code = NotFound desc = could not find container \"ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1\": container with ID starting with ba0f4690e3fc8b6e52813da6f5a3ca2b354e4a21a402fcbdaf552d0404f13ad1 not found: ID does not exist" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.719425 4842 scope.go:117] "RemoveContainer" containerID="abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.720768 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-1"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.728322 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-1"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.734901 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-2"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.740412 4842 scope.go:117] "RemoveContainer" containerID="abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" Mar 11 19:18:50 crc kubenswrapper[4842]: E0311 19:18:50.740833 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25\": 
container with ID starting with abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25 not found: ID does not exist" containerID="abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.740859 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25"} err="failed to get container status \"abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25\": rpc error: code = NotFound desc = could not find container \"abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25\": container with ID starting with abebba1288fcfeaef117dff714757def98353dd5ecc411762fb9ba7cb98dfc25 not found: ID does not exist" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.740879 4842 scope.go:117] "RemoveContainer" containerID="ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.741563 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-2"] Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.762105 4842 scope.go:117] "RemoveContainer" containerID="8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.786502 4842 scope.go:117] "RemoveContainer" containerID="ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414" Mar 11 19:18:50 crc kubenswrapper[4842]: E0311 19:18:50.786954 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414\": container with ID starting with ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414 not found: ID does not exist" containerID="ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414" Mar 11 19:18:50 crc 
kubenswrapper[4842]: I0311 19:18:50.786989 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414"} err="failed to get container status \"ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414\": rpc error: code = NotFound desc = could not find container \"ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414\": container with ID starting with ed4befa2f1ad690b128c3aac04f31d902c28cd8b4073bac4882a026ab2344414 not found: ID does not exist" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.787009 4842 scope.go:117] "RemoveContainer" containerID="8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a" Mar 11 19:18:50 crc kubenswrapper[4842]: E0311 19:18:50.787469 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a\": container with ID starting with 8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a not found: ID does not exist" containerID="8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.787573 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a"} err="failed to get container status \"8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a\": rpc error: code = NotFound desc = could not find container \"8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a\": container with ID starting with 8b5ff022ea697fff8abbb1b2de70a892088542fc639888c58f1a0078345e8a6a not found: ID does not exist" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.971753 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b485bd9-dd54-4ba8-b27f-ceda50b858f8" 
path="/var/lib/kubelet/pods/0b485bd9-dd54-4ba8-b27f-ceda50b858f8/volumes" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.972236 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea0f15c-863a-46b4-9a4f-42df55730e40" path="/var/lib/kubelet/pods/4ea0f15c-863a-46b4-9a4f-42df55730e40/volumes" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.972750 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" path="/var/lib/kubelet/pods/f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961/volumes" Mar 11 19:18:50 crc kubenswrapper[4842]: I0311 19:18:50.973728 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" path="/var/lib/kubelet/pods/fc95ee94-67ba-4093-a29d-846ab4c1d6c0/volumes" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.631783 4842 generic.go:334] "Generic (PLEG): container finished" podID="1335c672-48b7-46e6-a70e-eb54e14ce800" containerID="9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a" exitCode=0 Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.632104 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" event={"ID":"1335c672-48b7-46e6-a70e-eb54e14ce800","Type":"ContainerDied","Data":"9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a"} Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.634484 4842 generic.go:334] "Generic (PLEG): container finished" podID="502dc472-1dee-4d14-97a3-38494f63d086" containerID="30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155" exitCode=0 Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.634621 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" event={"ID":"502dc472-1dee-4d14-97a3-38494f63d086","Type":"ContainerDied","Data":"30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155"} Mar 11 19:18:51 crc 
kubenswrapper[4842]: I0311 19:18:51.759678 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.766071 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.856257 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8l8j\" (UniqueName: \"kubernetes.io/projected/1335c672-48b7-46e6-a70e-eb54e14ce800-kube-api-access-c8l8j\") pod \"1335c672-48b7-46e6-a70e-eb54e14ce800\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.856357 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jssq\" (UniqueName: \"kubernetes.io/projected/502dc472-1dee-4d14-97a3-38494f63d086-kube-api-access-2jssq\") pod \"502dc472-1dee-4d14-97a3-38494f63d086\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.856472 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1335c672-48b7-46e6-a70e-eb54e14ce800-config-data\") pod \"1335c672-48b7-46e6-a70e-eb54e14ce800\" (UID: \"1335c672-48b7-46e6-a70e-eb54e14ce800\") " Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.856533 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502dc472-1dee-4d14-97a3-38494f63d086-config-data\") pod \"502dc472-1dee-4d14-97a3-38494f63d086\" (UID: \"502dc472-1dee-4d14-97a3-38494f63d086\") " Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.867637 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1335c672-48b7-46e6-a70e-eb54e14ce800-kube-api-access-c8l8j" (OuterVolumeSpecName: "kube-api-access-c8l8j") pod "1335c672-48b7-46e6-a70e-eb54e14ce800" (UID: "1335c672-48b7-46e6-a70e-eb54e14ce800"). InnerVolumeSpecName "kube-api-access-c8l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.867687 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502dc472-1dee-4d14-97a3-38494f63d086-kube-api-access-2jssq" (OuterVolumeSpecName: "kube-api-access-2jssq") pod "502dc472-1dee-4d14-97a3-38494f63d086" (UID: "502dc472-1dee-4d14-97a3-38494f63d086"). InnerVolumeSpecName "kube-api-access-2jssq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.899742 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502dc472-1dee-4d14-97a3-38494f63d086-config-data" (OuterVolumeSpecName: "config-data") pod "502dc472-1dee-4d14-97a3-38494f63d086" (UID: "502dc472-1dee-4d14-97a3-38494f63d086"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.901894 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1335c672-48b7-46e6-a70e-eb54e14ce800-config-data" (OuterVolumeSpecName: "config-data") pod "1335c672-48b7-46e6-a70e-eb54e14ce800" (UID: "1335c672-48b7-46e6-a70e-eb54e14ce800"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.958569 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502dc472-1dee-4d14-97a3-38494f63d086-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.958614 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8l8j\" (UniqueName: \"kubernetes.io/projected/1335c672-48b7-46e6-a70e-eb54e14ce800-kube-api-access-c8l8j\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.958626 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jssq\" (UniqueName: \"kubernetes.io/projected/502dc472-1dee-4d14-97a3-38494f63d086-kube-api-access-2jssq\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:51 crc kubenswrapper[4842]: I0311 19:18:51.958636 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1335c672-48b7-46e6-a70e-eb54e14ce800-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.057160 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/keystone-db-create-xwxph"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.067010 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/keystone-06c9-account-create-update-ln4k5"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.075805 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/placement-db-create-mwt27"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.082814 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/placement-2bc1-account-create-update-zklwm"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.088259 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["nova-kuttl-default/keystone-06c9-account-create-update-ln4k5"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.094311 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/placement-db-create-mwt27"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.100303 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/keystone-db-create-xwxph"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.107032 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/placement-2bc1-account-create-update-zklwm"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.646564 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" event={"ID":"1335c672-48b7-46e6-a70e-eb54e14ce800","Type":"ContainerDied","Data":"a7cfa960a911c714077274b760a6cfe2885dfe31e66fc0cc3de691927d52ac5e"} Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.646623 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-2" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.646631 4842 scope.go:117] "RemoveContainer" containerID="9dcf083bb2bc76fe54837a9dba4c395cf60dceced5a2d320d8352e52d546e32a" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.649283 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" event={"ID":"502dc472-1dee-4d14-97a3-38494f63d086","Type":"ContainerDied","Data":"bf9b6d996d17956edfb1577866f524199996bac0d0305d7199c3bf332e251e48"} Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.649398 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-1" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.681736 4842 scope.go:117] "RemoveContainer" containerID="30d99caa543b307a5e70b6765a95cff177a5693f4aae9ae2a02c7db019663155" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.707995 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-2"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.718955 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-2"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.725175 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-1"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.731088 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-1"] Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.972227 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1335c672-48b7-46e6-a70e-eb54e14ce800" path="/var/lib/kubelet/pods/1335c672-48b7-46e6-a70e-eb54e14ce800/volumes" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.972967 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502dc472-1dee-4d14-97a3-38494f63d086" path="/var/lib/kubelet/pods/502dc472-1dee-4d14-97a3-38494f63d086/volumes" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.973540 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f35e50-4944-4740-a5a2-f35bfc66b4d7" path="/var/lib/kubelet/pods/51f35e50-4944-4740-a5a2-f35bfc66b4d7/volumes" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.974005 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86fa2647-8583-4275-a4ed-f664fb1b1c20" path="/var/lib/kubelet/pods/86fa2647-8583-4275-a4ed-f664fb1b1c20/volumes" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 
19:18:52.974962 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcf6712-e4ab-4aa4-848e-46de1967ef16" path="/var/lib/kubelet/pods/ddcf6712-e4ab-4aa4-848e-46de1967ef16/volumes" Mar 11 19:18:52 crc kubenswrapper[4842]: I0311 19:18:52.975497 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b7de43-4cd4-4c79-bc25-f88450b0b0fa" path="/var/lib/kubelet/pods/f9b7de43-4cd4-4c79-bc25-f88450b0b0fa/volumes" Mar 11 19:18:57 crc kubenswrapper[4842]: I0311 19:18:57.961814 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:18:57 crc kubenswrapper[4842]: E0311 19:18:57.962504 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:19:02 crc kubenswrapper[4842]: I0311 19:19:02.061872 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/root-account-create-update-5v6ch"] Mar 11 19:19:02 crc kubenswrapper[4842]: I0311 19:19:02.069100 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/root-account-create-update-5v6ch"] Mar 11 19:19:02 crc kubenswrapper[4842]: I0311 19:19:02.974809 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039d49b9-4b53-43b7-9d1b-871c543d17ed" path="/var/lib/kubelet/pods/039d49b9-4b53-43b7-9d1b-871c543d17ed/volumes" Mar 11 19:19:05 crc kubenswrapper[4842]: I0311 19:19:05.144765 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:05 crc kubenswrapper[4842]: I0311 19:19:05.145547 4842 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-log" containerID="cri-o://560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0" gracePeriod=30 Mar 11 19:19:05 crc kubenswrapper[4842]: I0311 19:19:05.145743 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-api" containerID="cri-o://016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2" gracePeriod=30 Mar 11 19:19:05 crc kubenswrapper[4842]: I0311 19:19:05.552767 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:19:05 crc kubenswrapper[4842]: I0311 19:19:05.553029 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="63c2afef-0b62-427f-942e-330b7a88f2b3" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" gracePeriod=30 Mar 11 19:19:05 crc kubenswrapper[4842]: I0311 19:19:05.774298 4842 generic.go:334] "Generic (PLEG): container finished" podID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerID="560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0" exitCode=143 Mar 11 19:19:05 crc kubenswrapper[4842]: I0311 19:19:05.774316 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"3c57a473-220e-4c5e-961c-7d5b738ced0f","Type":"ContainerDied","Data":"560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0"} Mar 11 19:19:07 crc kubenswrapper[4842]: E0311 19:19:07.593858 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:19:07 crc kubenswrapper[4842]: E0311 19:19:07.596675 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:19:07 crc kubenswrapper[4842]: E0311 19:19:07.603889 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:19:07 crc kubenswrapper[4842]: E0311 19:19:07.604871 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="63c2afef-0b62-427f-942e-330b7a88f2b3" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.673315 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.758284 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57a473-220e-4c5e-961c-7d5b738ced0f-logs\") pod \"3c57a473-220e-4c5e-961c-7d5b738ced0f\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.758691 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrm5l\" (UniqueName: \"kubernetes.io/projected/3c57a473-220e-4c5e-961c-7d5b738ced0f-kube-api-access-jrm5l\") pod \"3c57a473-220e-4c5e-961c-7d5b738ced0f\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.758775 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57a473-220e-4c5e-961c-7d5b738ced0f-config-data\") pod \"3c57a473-220e-4c5e-961c-7d5b738ced0f\" (UID: \"3c57a473-220e-4c5e-961c-7d5b738ced0f\") " Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.758819 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c57a473-220e-4c5e-961c-7d5b738ced0f-logs" (OuterVolumeSpecName: "logs") pod "3c57a473-220e-4c5e-961c-7d5b738ced0f" (UID: "3c57a473-220e-4c5e-961c-7d5b738ced0f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.759158 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57a473-220e-4c5e-961c-7d5b738ced0f-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.767646 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c57a473-220e-4c5e-961c-7d5b738ced0f-kube-api-access-jrm5l" (OuterVolumeSpecName: "kube-api-access-jrm5l") pod "3c57a473-220e-4c5e-961c-7d5b738ced0f" (UID: "3c57a473-220e-4c5e-961c-7d5b738ced0f"). InnerVolumeSpecName "kube-api-access-jrm5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.781307 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c57a473-220e-4c5e-961c-7d5b738ced0f-config-data" (OuterVolumeSpecName: "config-data") pod "3c57a473-220e-4c5e-961c-7d5b738ced0f" (UID: "3c57a473-220e-4c5e-961c-7d5b738ced0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.800637 4842 generic.go:334] "Generic (PLEG): container finished" podID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerID="016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2" exitCode=0 Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.800689 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.800693 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"3c57a473-220e-4c5e-961c-7d5b738ced0f","Type":"ContainerDied","Data":"016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2"} Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.800844 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"3c57a473-220e-4c5e-961c-7d5b738ced0f","Type":"ContainerDied","Data":"a82fd9c81c082dc8e0837f08e0caa91a67d8f4b21ebd6975dc407f03fea997b6"} Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.800880 4842 scope.go:117] "RemoveContainer" containerID="016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.838448 4842 scope.go:117] "RemoveContainer" containerID="560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.843360 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.855378 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.860928 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrm5l\" (UniqueName: \"kubernetes.io/projected/3c57a473-220e-4c5e-961c-7d5b738ced0f-kube-api-access-jrm5l\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.860982 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57a473-220e-4c5e-961c-7d5b738ced0f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.863974 4842 scope.go:117] 
"RemoveContainer" containerID="016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2" Mar 11 19:19:08 crc kubenswrapper[4842]: E0311 19:19:08.864544 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2\": container with ID starting with 016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2 not found: ID does not exist" containerID="016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.864607 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2"} err="failed to get container status \"016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2\": rpc error: code = NotFound desc = could not find container \"016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2\": container with ID starting with 016820f812271080286a310a8846b7f57eb83ba1ba4523de5526edc510aee8e2 not found: ID does not exist" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.864652 4842 scope.go:117] "RemoveContainer" containerID="560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0" Mar 11 19:19:08 crc kubenswrapper[4842]: E0311 19:19:08.865166 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0\": container with ID starting with 560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0 not found: ID does not exist" containerID="560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.865254 4842 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0"} err="failed to get container status \"560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0\": rpc error: code = NotFound desc = could not find container \"560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0\": container with ID starting with 560cdc84171c836aa3e4d9f6e0562abe1f257166513d49161b2aa83f2e51ddd0 not found: ID does not exist" Mar 11 19:19:08 crc kubenswrapper[4842]: I0311 19:19:08.972317 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" path="/var/lib/kubelet/pods/3c57a473-220e-4c5e-961c-7d5b738ced0f/volumes" Mar 11 19:19:09 crc kubenswrapper[4842]: I0311 19:19:09.962424 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:19:09 crc kubenswrapper[4842]: E0311 19:19:09.963060 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.299611 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.303414 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c2afef-0b62-427f-942e-330b7a88f2b3-config-data\") pod \"63c2afef-0b62-427f-942e-330b7a88f2b3\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.303529 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx2m4\" (UniqueName: \"kubernetes.io/projected/63c2afef-0b62-427f-942e-330b7a88f2b3-kube-api-access-fx2m4\") pod \"63c2afef-0b62-427f-942e-330b7a88f2b3\" (UID: \"63c2afef-0b62-427f-942e-330b7a88f2b3\") " Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.308938 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c2afef-0b62-427f-942e-330b7a88f2b3-kube-api-access-fx2m4" (OuterVolumeSpecName: "kube-api-access-fx2m4") pod "63c2afef-0b62-427f-942e-330b7a88f2b3" (UID: "63c2afef-0b62-427f-942e-330b7a88f2b3"). InnerVolumeSpecName "kube-api-access-fx2m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.337555 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c2afef-0b62-427f-942e-330b7a88f2b3-config-data" (OuterVolumeSpecName: "config-data") pod "63c2afef-0b62-427f-942e-330b7a88f2b3" (UID: "63c2afef-0b62-427f-942e-330b7a88f2b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.405074 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c2afef-0b62-427f-942e-330b7a88f2b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.405381 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx2m4\" (UniqueName: \"kubernetes.io/projected/63c2afef-0b62-427f-942e-330b7a88f2b3-kube-api-access-fx2m4\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.844299 4842 generic.go:334] "Generic (PLEG): container finished" podID="63c2afef-0b62-427f-942e-330b7a88f2b3" containerID="5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" exitCode=0 Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.844369 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.844365 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"63c2afef-0b62-427f-942e-330b7a88f2b3","Type":"ContainerDied","Data":"5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589"} Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.844520 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"63c2afef-0b62-427f-942e-330b7a88f2b3","Type":"ContainerDied","Data":"4fb7de1146cab22a60720584aa6483ac50b99497f6dc06732724bc814f2343f2"} Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.844559 4842 scope.go:117] "RemoveContainer" containerID="5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.851193 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.851442 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="a39b42bf-877b-4b5a-b0a3-998aa208a41d" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://c3ad2425978b65f6439feb572682b513835f76760e6cefca16113e56c23118f3" gracePeriod=30 Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.879373 4842 scope.go:117] "RemoveContainer" containerID="5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" Mar 11 19:19:11 crc kubenswrapper[4842]: E0311 19:19:11.879802 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589\": container with ID starting with 5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589 not found: ID does not exist" containerID="5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.879861 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589"} err="failed to get container status \"5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589\": rpc error: code = NotFound desc = could not find container \"5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589\": container with ID starting with 5224c4af06bf0d31fd762cce7a7051cef60f8db3c3cc095d2593327cec1af589 not found: ID does not exist" Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.886606 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.892610 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.945801 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.946037 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-log" containerID="cri-o://11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2" gracePeriod=30 Mar 11 19:19:11 crc kubenswrapper[4842]: I0311 19:19:11.946120 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c" gracePeriod=30 Mar 11 19:19:12 crc kubenswrapper[4842]: I0311 19:19:12.055436 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:19:12 crc kubenswrapper[4842]: I0311 19:19:12.055671 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="6c49e464-bc56-4675-a6e9-9e5997a85430" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0" gracePeriod=30 Mar 11 19:19:12 crc kubenswrapper[4842]: I0311 19:19:12.861395 4842 generic.go:334] "Generic (PLEG): container finished" podID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerID="11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2" exitCode=143 Mar 11 19:19:12 crc kubenswrapper[4842]: I0311 19:19:12.861434 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" 
event={"ID":"a3671efc-bed8-44b2-8663-60692f7a77a6","Type":"ContainerDied","Data":"11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2"} Mar 11 19:19:12 crc kubenswrapper[4842]: I0311 19:19:12.974840 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c2afef-0b62-427f-942e-330b7a88f2b3" path="/var/lib/kubelet/pods/63c2afef-0b62-427f-942e-330b7a88f2b3/volumes" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.018223 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.028232 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.036505 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xssrl"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.045097 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-7mh87"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086140 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell1d291-account-delete-8l6j6"] Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086501 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086521 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-log" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086535 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086544 4842 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086558 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086567 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086585 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b485bd9-dd54-4ba8-b27f-ceda50b858f8" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086593 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b485bd9-dd54-4ba8-b27f-ceda50b858f8" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086608 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086617 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086633 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086642 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086653 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc 
kubenswrapper[4842]: I0311 19:19:13.086662 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086677 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502dc472-1dee-4d14-97a3-38494f63d086" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086687 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="502dc472-1dee-4d14-97a3-38494f63d086" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086706 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea0f15c-863a-46b4-9a4f-42df55730e40" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086715 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea0f15c-863a-46b4-9a4f-42df55730e40" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086726 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cffefc0-8682-44f3-8b07-6d766905faf6" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086735 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cffefc0-8682-44f3-8b07-6d766905faf6" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086751 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1335c672-48b7-46e6-a70e-eb54e14ce800" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086761 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1335c672-48b7-46e6-a70e-eb54e14ce800" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086776 4842 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086784 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086798 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c2afef-0b62-427f-942e-330b7a88f2b3" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086807 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c2afef-0b62-427f-942e-330b7a88f2b3" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086825 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086836 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086854 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086865 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-log" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086877 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086885 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" 
containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:13 crc kubenswrapper[4842]: E0311 19:19:13.086900 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241ab90a-71bc-4e09-a4e3-e620a090cdbf" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.086908 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="241ab90a-71bc-4e09-a4e3-e620a090cdbf" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087087 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087102 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cffefc0-8682-44f3-8b07-6d766905faf6" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087122 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c57a473-220e-4c5e-961c-7d5b738ced0f" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087134 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087145 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1335c672-48b7-46e6-a70e-eb54e14ce800" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087158 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087171 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-api" Mar 11 19:19:13 crc 
kubenswrapper[4842]: I0311 19:19:13.087183 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c2afef-0b62-427f-942e-330b7a88f2b3" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087192 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea0f15c-863a-46b4-9a4f-42df55730e40" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087204 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="241ab90a-71bc-4e09-a4e3-e620a090cdbf" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087217 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b485bd9-dd54-4ba8-b27f-ceda50b858f8" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087229 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a0491e-8184-4312-a283-91f394d597ff" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087239 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087250 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a4bfe2-8c06-4bc3-8617-d97d1a7cd961" containerName="nova-kuttl-metadata-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087262 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="502dc472-1dee-4d14-97a3-38494f63d086" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087296 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc95ee94-67ba-4093-a29d-846ab4c1d6c0" containerName="nova-kuttl-metadata-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087310 4842 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fcfc70-0a88-4099-869c-aed4a16dc1a3" containerName="nova-kuttl-api-log" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.087889 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.097900 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1d291-account-delete-8l6j6"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.141068 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v9pj\" (UniqueName: \"kubernetes.io/projected/abf72003-0177-400a-aa79-5f7d957ae91c-kube-api-access-4v9pj\") pod \"novacell1d291-account-delete-8l6j6\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.141371 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abf72003-0177-400a-aa79-5f7d957ae91c-operator-scripts\") pod \"novacell1d291-account-delete-8l6j6\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.170399 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell00c0c-account-delete-2nvn6"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.171540 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.187436 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.187663 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="c31eb1f5-ab2d-48d6-82d8-6af5678a670d" containerName="nova-kuttl-cell1-novncproxy-novncproxy" containerID="cri-o://df1ce42939844cc243229418a966ebb2f97994c12840e662c2d9a0ccb21bbd50" gracePeriod=30 Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.196664 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell00c0c-account-delete-2nvn6"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.207218 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.220822 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-mqfqh"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.228473 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novaapiec26-account-delete-7nbbh"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.229922 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.232503 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapiec26-account-delete-7nbbh"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.243534 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v9pj\" (UniqueName: \"kubernetes.io/projected/abf72003-0177-400a-aa79-5f7d957ae91c-kube-api-access-4v9pj\") pod \"novacell1d291-account-delete-8l6j6\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.243686 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4p6\" (UniqueName: \"kubernetes.io/projected/f85ac3d4-f734-40ad-b192-f4edd1421216-kube-api-access-nl4p6\") pod \"novacell00c0c-account-delete-2nvn6\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.243788 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f85ac3d4-f734-40ad-b192-f4edd1421216-operator-scripts\") pod \"novacell00c0c-account-delete-2nvn6\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.243868 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwtm\" (UniqueName: \"kubernetes.io/projected/9de22683-ed7f-42a2-b24f-8fb00687086b-kube-api-access-rrwtm\") pod \"novaapiec26-account-delete-7nbbh\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" 
Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.244007 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abf72003-0177-400a-aa79-5f7d957ae91c-operator-scripts\") pod \"novacell1d291-account-delete-8l6j6\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.244135 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9de22683-ed7f-42a2-b24f-8fb00687086b-operator-scripts\") pod \"novaapiec26-account-delete-7nbbh\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.244868 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abf72003-0177-400a-aa79-5f7d957ae91c-operator-scripts\") pod \"novacell1d291-account-delete-8l6j6\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.277059 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v9pj\" (UniqueName: \"kubernetes.io/projected/abf72003-0177-400a-aa79-5f7d957ae91c-kube-api-access-4v9pj\") pod \"novacell1d291-account-delete-8l6j6\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.303328 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.320137 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-t8rld"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.346624 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4p6\" (UniqueName: \"kubernetes.io/projected/f85ac3d4-f734-40ad-b192-f4edd1421216-kube-api-access-nl4p6\") pod \"novacell00c0c-account-delete-2nvn6\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.346685 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f85ac3d4-f734-40ad-b192-f4edd1421216-operator-scripts\") pod \"novacell00c0c-account-delete-2nvn6\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.346736 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwtm\" (UniqueName: \"kubernetes.io/projected/9de22683-ed7f-42a2-b24f-8fb00687086b-kube-api-access-rrwtm\") pod \"novaapiec26-account-delete-7nbbh\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.346809 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9de22683-ed7f-42a2-b24f-8fb00687086b-operator-scripts\") pod \"novaapiec26-account-delete-7nbbh\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.347593 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9de22683-ed7f-42a2-b24f-8fb00687086b-operator-scripts\") pod 
\"novaapiec26-account-delete-7nbbh\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.347821 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f85ac3d4-f734-40ad-b192-f4edd1421216-operator-scripts\") pod \"novacell00c0c-account-delete-2nvn6\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.363550 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwtm\" (UniqueName: \"kubernetes.io/projected/9de22683-ed7f-42a2-b24f-8fb00687086b-kube-api-access-rrwtm\") pod \"novaapiec26-account-delete-7nbbh\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.383121 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4p6\" (UniqueName: \"kubernetes.io/projected/f85ac3d4-f734-40ad-b192-f4edd1421216-kube-api-access-nl4p6\") pod \"novacell00c0c-account-delete-2nvn6\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.408633 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.487156 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.547404 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.854485 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1d291-account-delete-8l6j6"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.871961 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapiec26-account-delete-7nbbh"] Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.885615 4842 generic.go:334] "Generic (PLEG): container finished" podID="a39b42bf-877b-4b5a-b0a3-998aa208a41d" containerID="c3ad2425978b65f6439feb572682b513835f76760e6cefca16113e56c23118f3" exitCode=0 Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.885680 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"a39b42bf-877b-4b5a-b0a3-998aa208a41d","Type":"ContainerDied","Data":"c3ad2425978b65f6439feb572682b513835f76760e6cefca16113e56c23118f3"} Mar 11 19:19:13 crc kubenswrapper[4842]: W0311 19:19:13.895132 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9de22683_ed7f_42a2_b24f_8fb00687086b.slice/crio-1f69201016d0c442fa393d9564abaffce0eb26e08b28bd04edca31d54b89eb9e WatchSource:0}: Error finding container 1f69201016d0c442fa393d9564abaffce0eb26e08b28bd04edca31d54b89eb9e: Status 404 returned error can't find the container with id 1f69201016d0c442fa393d9564abaffce0eb26e08b28bd04edca31d54b89eb9e Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.895839 4842 generic.go:334] "Generic (PLEG): container finished" podID="c31eb1f5-ab2d-48d6-82d8-6af5678a670d" containerID="df1ce42939844cc243229418a966ebb2f97994c12840e662c2d9a0ccb21bbd50" exitCode=0 Mar 11 19:19:13 crc kubenswrapper[4842]: I0311 19:19:13.895872 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" 
event={"ID":"c31eb1f5-ab2d-48d6-82d8-6af5678a670d","Type":"ContainerDied","Data":"df1ce42939844cc243229418a966ebb2f97994c12840e662c2d9a0ccb21bbd50"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.005806 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell00c0c-account-delete-2nvn6"] Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.146549 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.162950 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b42bf-877b-4b5a-b0a3-998aa208a41d-config-data\") pod \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.163087 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87kjf\" (UniqueName: \"kubernetes.io/projected/a39b42bf-877b-4b5a-b0a3-998aa208a41d-kube-api-access-87kjf\") pod \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\" (UID: \"a39b42bf-877b-4b5a-b0a3-998aa208a41d\") " Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.216153 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39b42bf-877b-4b5a-b0a3-998aa208a41d-kube-api-access-87kjf" (OuterVolumeSpecName: "kube-api-access-87kjf") pod "a39b42bf-877b-4b5a-b0a3-998aa208a41d" (UID: "a39b42bf-877b-4b5a-b0a3-998aa208a41d"). InnerVolumeSpecName "kube-api-access-87kjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.223527 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39b42bf-877b-4b5a-b0a3-998aa208a41d-config-data" (OuterVolumeSpecName: "config-data") pod "a39b42bf-877b-4b5a-b0a3-998aa208a41d" (UID: "a39b42bf-877b-4b5a-b0a3-998aa208a41d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.241403 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.265233 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw7xq\" (UniqueName: \"kubernetes.io/projected/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-kube-api-access-gw7xq\") pod \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.265386 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-config-data\") pod \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\" (UID: \"c31eb1f5-ab2d-48d6-82d8-6af5678a670d\") " Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.265685 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39b42bf-877b-4b5a-b0a3-998aa208a41d-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.265703 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87kjf\" (UniqueName: \"kubernetes.io/projected/a39b42bf-877b-4b5a-b0a3-998aa208a41d-kube-api-access-87kjf\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.277599 
4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-kube-api-access-gw7xq" (OuterVolumeSpecName: "kube-api-access-gw7xq") pod "c31eb1f5-ab2d-48d6-82d8-6af5678a670d" (UID: "c31eb1f5-ab2d-48d6-82d8-6af5678a670d"). InnerVolumeSpecName "kube-api-access-gw7xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.303585 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-config-data" (OuterVolumeSpecName: "config-data") pod "c31eb1f5-ab2d-48d6-82d8-6af5678a670d" (UID: "c31eb1f5-ab2d-48d6-82d8-6af5678a670d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.367154 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw7xq\" (UniqueName: \"kubernetes.io/projected/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-kube-api-access-gw7xq\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.367195 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31eb1f5-ab2d-48d6-82d8-6af5678a670d-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.895054 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.906291 4842 generic.go:334] "Generic (PLEG): container finished" podID="6c49e464-bc56-4675-a6e9-9e5997a85430" containerID="9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0" exitCode=0 Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.906342 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.906349 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"6c49e464-bc56-4675-a6e9-9e5997a85430","Type":"ContainerDied","Data":"9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.906467 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"6c49e464-bc56-4675-a6e9-9e5997a85430","Type":"ContainerDied","Data":"3ef4eab3428a2d27404c1b003f56340e8c846a9b371062cec5c0dbd5a49a7505"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.906485 4842 scope.go:117] "RemoveContainer" containerID="9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.909422 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"c31eb1f5-ab2d-48d6-82d8-6af5678a670d","Type":"ContainerDied","Data":"59426aec751321b417b21dcdb2bddada7577234358debe06a9fa058bd1bb9102"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.909435 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.910659 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"a39b42bf-877b-4b5a-b0a3-998aa208a41d","Type":"ContainerDied","Data":"e326c85627a13d9d98ddcdb6c7c68f88a86c5eaffdd574fc74729219565f4bad"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.910794 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.913739 4842 generic.go:334] "Generic (PLEG): container finished" podID="9de22683-ed7f-42a2-b24f-8fb00687086b" containerID="88d564381d7f26e27d1b3d0a05c9b10613f2d4f1e17139990fda10372a35824b" exitCode=0 Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.913832 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" event={"ID":"9de22683-ed7f-42a2-b24f-8fb00687086b","Type":"ContainerDied","Data":"88d564381d7f26e27d1b3d0a05c9b10613f2d4f1e17139990fda10372a35824b"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.913858 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" event={"ID":"9de22683-ed7f-42a2-b24f-8fb00687086b","Type":"ContainerStarted","Data":"1f69201016d0c442fa393d9564abaffce0eb26e08b28bd04edca31d54b89eb9e"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.915656 4842 generic.go:334] "Generic (PLEG): container finished" podID="abf72003-0177-400a-aa79-5f7d957ae91c" containerID="d3968f4d3635345512a39a183159f9905cf7eb460774c862344b832cffb1b77f" exitCode=0 Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.915728 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" event={"ID":"abf72003-0177-400a-aa79-5f7d957ae91c","Type":"ContainerDied","Data":"d3968f4d3635345512a39a183159f9905cf7eb460774c862344b832cffb1b77f"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.915756 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" event={"ID":"abf72003-0177-400a-aa79-5f7d957ae91c","Type":"ContainerStarted","Data":"8249ee4ecc180da6274f06745b65b435e0e8a1dbc476c4a9ffff815d8abd0608"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.918119 4842 generic.go:334] "Generic (PLEG): 
container finished" podID="f85ac3d4-f734-40ad-b192-f4edd1421216" containerID="2ba1fa8aecca2b976ca1500d29b35e961cc54ad785288f18ad3d9b7c269a093e" exitCode=0 Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.918150 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" event={"ID":"f85ac3d4-f734-40ad-b192-f4edd1421216","Type":"ContainerDied","Data":"2ba1fa8aecca2b976ca1500d29b35e961cc54ad785288f18ad3d9b7c269a093e"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.918167 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" event={"ID":"f85ac3d4-f734-40ad-b192-f4edd1421216","Type":"ContainerStarted","Data":"baf5d34cf12586d5ae5d74d045455b22abe6e9030d423a66899ea65028683c85"} Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.932985 4842 scope.go:117] "RemoveContainer" containerID="9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0" Mar 11 19:19:14 crc kubenswrapper[4842]: E0311 19:19:14.933554 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0\": container with ID starting with 9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0 not found: ID does not exist" containerID="9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.933600 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0"} err="failed to get container status \"9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0\": rpc error: code = NotFound desc = could not find container \"9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0\": container with ID starting with 
9ac377084db71d9e0a837f508318d93f570bb502ae7f039f057f250ca5f110f0 not found: ID does not exist" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.933630 4842 scope.go:117] "RemoveContainer" containerID="df1ce42939844cc243229418a966ebb2f97994c12840e662c2d9a0ccb21bbd50" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.957306 4842 scope.go:117] "RemoveContainer" containerID="c3ad2425978b65f6439feb572682b513835f76760e6cefca16113e56c23118f3" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.979866 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04abba73-25fa-4cda-b3cc-6a2cc23a769b" path="/var/lib/kubelet/pods/04abba73-25fa-4cda-b3cc-6a2cc23a769b/volumes" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.980373 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-592zd\" (UniqueName: \"kubernetes.io/projected/6c49e464-bc56-4675-a6e9-9e5997a85430-kube-api-access-592zd\") pod \"6c49e464-bc56-4675-a6e9-9e5997a85430\" (UID: \"6c49e464-bc56-4675-a6e9-9e5997a85430\") " Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.980742 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c49e464-bc56-4675-a6e9-9e5997a85430-config-data\") pod \"6c49e464-bc56-4675-a6e9-9e5997a85430\" (UID: \"6c49e464-bc56-4675-a6e9-9e5997a85430\") " Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.981069 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d2faaf-7c1c-4d5b-886b-67c259fe8f77" path="/var/lib/kubelet/pods/56d2faaf-7c1c-4d5b-886b-67c259fe8f77/volumes" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.982799 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f373954-daef-4bf6-a56b-7036ad380787" path="/var/lib/kubelet/pods/6f373954-daef-4bf6-a56b-7036ad380787/volumes" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.983526 4842 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca705f61-aadc-49bd-a249-0bd50776875c" path="/var/lib/kubelet/pods/ca705f61-aadc-49bd-a249-0bd50776875c/volumes" Mar 11 19:19:14 crc kubenswrapper[4842]: I0311 19:19:14.992149 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c49e464-bc56-4675-a6e9-9e5997a85430-kube-api-access-592zd" (OuterVolumeSpecName: "kube-api-access-592zd") pod "6c49e464-bc56-4675-a6e9-9e5997a85430" (UID: "6c49e464-bc56-4675-a6e9-9e5997a85430"). InnerVolumeSpecName "kube-api-access-592zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.001218 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.006512 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c49e464-bc56-4675-a6e9-9e5997a85430-config-data" (OuterVolumeSpecName: "config-data") pod "6c49e464-bc56-4675-a6e9-9e5997a85430" (UID: "6c49e464-bc56-4675-a6e9-9e5997a85430"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.010504 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.018746 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.025291 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.070846 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.216:8775/\": read tcp 10.217.0.2:45396->10.217.0.216:8775: read: connection reset by peer" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.070900 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.216:8775/\": read tcp 10.217.0.2:45382->10.217.0.216:8775: read: connection reset by peer" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.081998 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c49e464-bc56-4675-a6e9-9e5997a85430-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.082033 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-592zd\" (UniqueName: \"kubernetes.io/projected/6c49e464-bc56-4675-a6e9-9e5997a85430-kube-api-access-592zd\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.303402 4842 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.312727 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.504383 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.591509 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg9r6\" (UniqueName: \"kubernetes.io/projected/a3671efc-bed8-44b2-8663-60692f7a77a6-kube-api-access-bg9r6\") pod \"a3671efc-bed8-44b2-8663-60692f7a77a6\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.591672 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3671efc-bed8-44b2-8663-60692f7a77a6-config-data\") pod \"a3671efc-bed8-44b2-8663-60692f7a77a6\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.592027 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3671efc-bed8-44b2-8663-60692f7a77a6-logs\") pod \"a3671efc-bed8-44b2-8663-60692f7a77a6\" (UID: \"a3671efc-bed8-44b2-8663-60692f7a77a6\") " Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.592446 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3671efc-bed8-44b2-8663-60692f7a77a6-logs" (OuterVolumeSpecName: "logs") pod "a3671efc-bed8-44b2-8663-60692f7a77a6" (UID: "a3671efc-bed8-44b2-8663-60692f7a77a6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.596965 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3671efc-bed8-44b2-8663-60692f7a77a6-kube-api-access-bg9r6" (OuterVolumeSpecName: "kube-api-access-bg9r6") pod "a3671efc-bed8-44b2-8663-60692f7a77a6" (UID: "a3671efc-bed8-44b2-8663-60692f7a77a6"). InnerVolumeSpecName "kube-api-access-bg9r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.618085 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3671efc-bed8-44b2-8663-60692f7a77a6-config-data" (OuterVolumeSpecName: "config-data") pod "a3671efc-bed8-44b2-8663-60692f7a77a6" (UID: "a3671efc-bed8-44b2-8663-60692f7a77a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.694611 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg9r6\" (UniqueName: \"kubernetes.io/projected/a3671efc-bed8-44b2-8663-60692f7a77a6-kube-api-access-bg9r6\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.694663 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3671efc-bed8-44b2-8663-60692f7a77a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.694677 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3671efc-bed8-44b2-8663-60692f7a77a6-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.929065 4842 generic.go:334] "Generic (PLEG): container finished" podID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerID="688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c" exitCode=0 Mar 11 
19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.929156 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"a3671efc-bed8-44b2-8663-60692f7a77a6","Type":"ContainerDied","Data":"688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c"} Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.929172 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.930561 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"a3671efc-bed8-44b2-8663-60692f7a77a6","Type":"ContainerDied","Data":"08fcfc73d689df29bb08685af1c6e1136cc6eacd46b9e471e0948b96ef31e622"} Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.930616 4842 scope.go:117] "RemoveContainer" containerID="688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.978983 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.981890 4842 scope.go:117] "RemoveContainer" containerID="11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2" Mar 11 19:19:15 crc kubenswrapper[4842]: I0311 19:19:15.987034 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.004581 4842 scope.go:117] "RemoveContainer" containerID="688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c" Mar 11 19:19:16 crc kubenswrapper[4842]: E0311 19:19:16.005068 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c\": container with ID starting with 
688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c not found: ID does not exist" containerID="688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.005099 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c"} err="failed to get container status \"688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c\": rpc error: code = NotFound desc = could not find container \"688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c\": container with ID starting with 688636194afc93fbbb4055f0a34e73a3bb63b61eedf34a1c1c3c353bc1ca8a4c not found: ID does not exist" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.005118 4842 scope.go:117] "RemoveContainer" containerID="11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2" Mar 11 19:19:16 crc kubenswrapper[4842]: E0311 19:19:16.005454 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2\": container with ID starting with 11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2 not found: ID does not exist" containerID="11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.005480 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2"} err="failed to get container status \"11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2\": rpc error: code = NotFound desc = could not find container \"11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2\": container with ID starting with 11cf697d9c3f5e0b69a29b0c92e54e631c79af809e370ddf669a0cbd68f3bbc2 not found: ID does not 
exist" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.218422 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.302951 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl4p6\" (UniqueName: \"kubernetes.io/projected/f85ac3d4-f734-40ad-b192-f4edd1421216-kube-api-access-nl4p6\") pod \"f85ac3d4-f734-40ad-b192-f4edd1421216\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.303033 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f85ac3d4-f734-40ad-b192-f4edd1421216-operator-scripts\") pod \"f85ac3d4-f734-40ad-b192-f4edd1421216\" (UID: \"f85ac3d4-f734-40ad-b192-f4edd1421216\") " Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.303743 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f85ac3d4-f734-40ad-b192-f4edd1421216-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f85ac3d4-f734-40ad-b192-f4edd1421216" (UID: "f85ac3d4-f734-40ad-b192-f4edd1421216"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.310455 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85ac3d4-f734-40ad-b192-f4edd1421216-kube-api-access-nl4p6" (OuterVolumeSpecName: "kube-api-access-nl4p6") pod "f85ac3d4-f734-40ad-b192-f4edd1421216" (UID: "f85ac3d4-f734-40ad-b192-f4edd1421216"). InnerVolumeSpecName "kube-api-access-nl4p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.367077 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.381463 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.404599 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abf72003-0177-400a-aa79-5f7d957ae91c-operator-scripts\") pod \"abf72003-0177-400a-aa79-5f7d957ae91c\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.404686 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrwtm\" (UniqueName: \"kubernetes.io/projected/9de22683-ed7f-42a2-b24f-8fb00687086b-kube-api-access-rrwtm\") pod \"9de22683-ed7f-42a2-b24f-8fb00687086b\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.404741 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9de22683-ed7f-42a2-b24f-8fb00687086b-operator-scripts\") pod \"9de22683-ed7f-42a2-b24f-8fb00687086b\" (UID: \"9de22683-ed7f-42a2-b24f-8fb00687086b\") " Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.404791 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v9pj\" (UniqueName: \"kubernetes.io/projected/abf72003-0177-400a-aa79-5f7d957ae91c-kube-api-access-4v9pj\") pod \"abf72003-0177-400a-aa79-5f7d957ae91c\" (UID: \"abf72003-0177-400a-aa79-5f7d957ae91c\") " Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.405149 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl4p6\" (UniqueName: \"kubernetes.io/projected/f85ac3d4-f734-40ad-b192-f4edd1421216-kube-api-access-nl4p6\") on node 
\"crc\" DevicePath \"\"" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.405171 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f85ac3d4-f734-40ad-b192-f4edd1421216-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.408402 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de22683-ed7f-42a2-b24f-8fb00687086b-kube-api-access-rrwtm" (OuterVolumeSpecName: "kube-api-access-rrwtm") pod "9de22683-ed7f-42a2-b24f-8fb00687086b" (UID: "9de22683-ed7f-42a2-b24f-8fb00687086b"). InnerVolumeSpecName "kube-api-access-rrwtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.408795 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf72003-0177-400a-aa79-5f7d957ae91c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "abf72003-0177-400a-aa79-5f7d957ae91c" (UID: "abf72003-0177-400a-aa79-5f7d957ae91c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.409043 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de22683-ed7f-42a2-b24f-8fb00687086b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9de22683-ed7f-42a2-b24f-8fb00687086b" (UID: "9de22683-ed7f-42a2-b24f-8fb00687086b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.412445 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf72003-0177-400a-aa79-5f7d957ae91c-kube-api-access-4v9pj" (OuterVolumeSpecName: "kube-api-access-4v9pj") pod "abf72003-0177-400a-aa79-5f7d957ae91c" (UID: "abf72003-0177-400a-aa79-5f7d957ae91c"). InnerVolumeSpecName "kube-api-access-4v9pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.506968 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9de22683-ed7f-42a2-b24f-8fb00687086b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.507017 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v9pj\" (UniqueName: \"kubernetes.io/projected/abf72003-0177-400a-aa79-5f7d957ae91c-kube-api-access-4v9pj\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.507032 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abf72003-0177-400a-aa79-5f7d957ae91c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.507042 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrwtm\" (UniqueName: \"kubernetes.io/projected/9de22683-ed7f-42a2-b24f-8fb00687086b-kube-api-access-rrwtm\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.943876 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" event={"ID":"9de22683-ed7f-42a2-b24f-8fb00687086b","Type":"ContainerDied","Data":"1f69201016d0c442fa393d9564abaffce0eb26e08b28bd04edca31d54b89eb9e"} Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 
19:19:16.943902 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapiec26-account-delete-7nbbh" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.943914 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f69201016d0c442fa393d9564abaffce0eb26e08b28bd04edca31d54b89eb9e" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.945402 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" event={"ID":"abf72003-0177-400a-aa79-5f7d957ae91c","Type":"ContainerDied","Data":"8249ee4ecc180da6274f06745b65b435e0e8a1dbc476c4a9ffff815d8abd0608"} Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.945423 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8249ee4ecc180da6274f06745b65b435e0e8a1dbc476c4a9ffff815d8abd0608" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.945439 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1d291-account-delete-8l6j6" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.947023 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" event={"ID":"f85ac3d4-f734-40ad-b192-f4edd1421216","Type":"ContainerDied","Data":"baf5d34cf12586d5ae5d74d045455b22abe6e9030d423a66899ea65028683c85"} Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.947042 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baf5d34cf12586d5ae5d74d045455b22abe6e9030d423a66899ea65028683c85" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.947073 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell00c0c-account-delete-2nvn6" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.974572 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c49e464-bc56-4675-a6e9-9e5997a85430" path="/var/lib/kubelet/pods/6c49e464-bc56-4675-a6e9-9e5997a85430/volumes" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.975248 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" path="/var/lib/kubelet/pods/a3671efc-bed8-44b2-8663-60692f7a77a6/volumes" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.975883 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39b42bf-877b-4b5a-b0a3-998aa208a41d" path="/var/lib/kubelet/pods/a39b42bf-877b-4b5a-b0a3-998aa208a41d/volumes" Mar 11 19:19:16 crc kubenswrapper[4842]: I0311 19:19:16.976918 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31eb1f5-ab2d-48d6-82d8-6af5678a670d" path="/var/lib/kubelet/pods/c31eb1f5-ab2d-48d6-82d8-6af5678a670d/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.115080 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-dcjpk"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.124520 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-dcjpk"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.139676 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell1d291-account-delete-8l6j6"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.147023 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.154700 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-d291-account-create-update-glvdn"] Mar 11 19:19:18 
crc kubenswrapper[4842]: I0311 19:19:18.163520 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell1d291-account-delete-8l6j6"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.210466 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gg2hz"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.231119 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gg2hz"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.241345 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell00c0c-account-delete-2nvn6"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.247833 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell00c0c-account-delete-2nvn6"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.254557 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.260677 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-0c0c-account-create-update-d76cc"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.316205 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-db-create-s8bgj"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.328324 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-db-create-s8bgj"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.336954 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novaapiec26-account-delete-7nbbh"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.346316 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 
19:19:18.355335 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-ec26-account-create-update-9ckbs"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.368347 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novaapiec26-account-delete-7nbbh"] Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.970129 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1137b12c-774f-434b-9d92-a7d5b6ee6ef9" path="/var/lib/kubelet/pods/1137b12c-774f-434b-9d92-a7d5b6ee6ef9/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.970767 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43cc024c-6b02-4efc-b7aa-7b1ec6785123" path="/var/lib/kubelet/pods/43cc024c-6b02-4efc-b7aa-7b1ec6785123/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.971232 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986d2be1-4400-40e0-8af9-9bb831ca357c" path="/var/lib/kubelet/pods/986d2be1-4400-40e0-8af9-9bb831ca357c/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.971678 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de22683-ed7f-42a2-b24f-8fb00687086b" path="/var/lib/kubelet/pods/9de22683-ed7f-42a2-b24f-8fb00687086b/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.972571 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4e3090-6260-429d-a8cc-ff5ec73181ea" path="/var/lib/kubelet/pods/9e4e3090-6260-429d-a8cc-ff5ec73181ea/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.973014 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf72003-0177-400a-aa79-5f7d957ae91c" path="/var/lib/kubelet/pods/abf72003-0177-400a-aa79-5f7d957ae91c/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.973522 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb9b95ff-e9b4-4df9-895c-172bb594b59e" 
path="/var/lib/kubelet/pods/bb9b95ff-e9b4-4df9-895c-172bb594b59e/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.974391 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1668ef-02d4-47cd-a896-5769784534bc" path="/var/lib/kubelet/pods/ea1668ef-02d4-47cd-a896-5769784534bc/volumes" Mar 11 19:19:18 crc kubenswrapper[4842]: I0311 19:19:18.974808 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85ac3d4-f734-40ad-b192-f4edd1421216" path="/var/lib/kubelet/pods/f85ac3d4-f734-40ad-b192-f4edd1421216/volumes" Mar 11 19:19:19 crc kubenswrapper[4842]: I0311 19:19:19.199734 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="c31eb1f5-ab2d-48d6-82d8-6af5678a670d" containerName="nova-kuttl-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.0.208:6080/vnc_lite.html\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223321 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-db-create-5cj2c"] Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223672 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c49e464-bc56-4675-a6e9-9e5997a85430" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223690 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c49e464-bc56-4675-a6e9-9e5997a85430" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223710 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31eb1f5-ab2d-48d6-82d8-6af5678a670d" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223717 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31eb1f5-ab2d-48d6-82d8-6af5678a670d" 
containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223737 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223748 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223758 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de22683-ed7f-42a2-b24f-8fb00687086b" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223764 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de22683-ed7f-42a2-b24f-8fb00687086b" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223777 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39b42bf-877b-4b5a-b0a3-998aa208a41d" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223783 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39b42bf-877b-4b5a-b0a3-998aa208a41d" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223797 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf72003-0177-400a-aa79-5f7d957ae91c" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223804 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf72003-0177-400a-aa79-5f7d957ae91c" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223815 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85ac3d4-f734-40ad-b192-f4edd1421216" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 
19:19:20.223822 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85ac3d4-f734-40ad-b192-f4edd1421216" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: E0311 19:19:20.223831 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-log" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223837 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-log" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223963 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf72003-0177-400a-aa79-5f7d957ae91c" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223976 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31eb1f5-ab2d-48d6-82d8-6af5678a670d" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.223994 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de22683-ed7f-42a2-b24f-8fb00687086b" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.224005 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.224017 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85ac3d4-f734-40ad-b192-f4edd1421216" containerName="mariadb-account-delete" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.224027 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39b42bf-877b-4b5a-b0a3-998aa208a41d" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.224034 4842 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6c49e464-bc56-4675-a6e9-9e5997a85430" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.224041 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3671efc-bed8-44b2-8663-60692f7a77a6" containerName="nova-kuttl-metadata-log" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.224585 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.235826 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-5cj2c"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.268003 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3099459d-b582-437c-85a8-2e10562224c9-operator-scripts\") pod \"nova-api-db-create-5cj2c\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.268152 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhzv\" (UniqueName: \"kubernetes.io/projected/3099459d-b582-437c-85a8-2e10562224c9-kube-api-access-qvhzv\") pod \"nova-api-db-create-5cj2c\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.315471 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gttk6"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.316675 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.335767 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gttk6"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.369317 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2r6\" (UniqueName: \"kubernetes.io/projected/7c323f17-b38b-4762-b699-0f53719ebe74-kube-api-access-pm2r6\") pod \"nova-cell0-db-create-gttk6\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.369369 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c323f17-b38b-4762-b699-0f53719ebe74-operator-scripts\") pod \"nova-cell0-db-create-gttk6\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.369441 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhzv\" (UniqueName: \"kubernetes.io/projected/3099459d-b582-437c-85a8-2e10562224c9-kube-api-access-qvhzv\") pod \"nova-api-db-create-5cj2c\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.369473 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3099459d-b582-437c-85a8-2e10562224c9-operator-scripts\") pod \"nova-api-db-create-5cj2c\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.370212 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3099459d-b582-437c-85a8-2e10562224c9-operator-scripts\") pod \"nova-api-db-create-5cj2c\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.398034 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhzv\" (UniqueName: \"kubernetes.io/projected/3099459d-b582-437c-85a8-2e10562224c9-kube-api-access-qvhzv\") pod \"nova-api-db-create-5cj2c\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.425596 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-b1de-account-create-update-tdww2"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.426483 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.430697 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-api-db-secret" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.437711 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-b1de-account-create-update-tdww2"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.470661 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2r6\" (UniqueName: \"kubernetes.io/projected/7c323f17-b38b-4762-b699-0f53719ebe74-kube-api-access-pm2r6\") pod \"nova-cell0-db-create-gttk6\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.470712 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c323f17-b38b-4762-b699-0f53719ebe74-operator-scripts\") pod \"nova-cell0-db-create-gttk6\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.470822 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32dddeff-d94c-4d7f-978a-d6930ee4d555-operator-scripts\") pod \"nova-api-b1de-account-create-update-tdww2\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.470855 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwcxs\" (UniqueName: \"kubernetes.io/projected/32dddeff-d94c-4d7f-978a-d6930ee4d555-kube-api-access-mwcxs\") pod \"nova-api-b1de-account-create-update-tdww2\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.471525 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c323f17-b38b-4762-b699-0f53719ebe74-operator-scripts\") pod \"nova-cell0-db-create-gttk6\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.486467 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2r6\" (UniqueName: \"kubernetes.io/projected/7c323f17-b38b-4762-b699-0f53719ebe74-kube-api-access-pm2r6\") pod \"nova-cell0-db-create-gttk6\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 
19:19:20.517970 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-p2qrl"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.518883 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.531479 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-p2qrl"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.540542 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.571662 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32dddeff-d94c-4d7f-978a-d6930ee4d555-operator-scripts\") pod \"nova-api-b1de-account-create-update-tdww2\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.571709 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwcxs\" (UniqueName: \"kubernetes.io/projected/32dddeff-d94c-4d7f-978a-d6930ee4d555-kube-api-access-mwcxs\") pod \"nova-api-b1de-account-create-update-tdww2\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.571759 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrkwh\" (UniqueName: \"kubernetes.io/projected/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-kube-api-access-hrkwh\") pod \"nova-cell1-db-create-p2qrl\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc 
kubenswrapper[4842]: I0311 19:19:20.571783 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-operator-scripts\") pod \"nova-cell1-db-create-p2qrl\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.572489 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32dddeff-d94c-4d7f-978a-d6930ee4d555-operator-scripts\") pod \"nova-api-b1de-account-create-update-tdww2\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.594472 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwcxs\" (UniqueName: \"kubernetes.io/projected/32dddeff-d94c-4d7f-978a-d6930ee4d555-kube-api-access-mwcxs\") pod \"nova-api-b1de-account-create-update-tdww2\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.612801 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.613932 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.621628 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell0-db-secret" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.638061 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.640292 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.673197 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1550b18b-bd82-4f4e-a758-418ebc45100e-operator-scripts\") pod \"nova-cell0-0a69-account-create-update-gp59s\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.673669 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrkwh\" (UniqueName: \"kubernetes.io/projected/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-kube-api-access-hrkwh\") pod \"nova-cell1-db-create-p2qrl\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.673722 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-operator-scripts\") pod \"nova-cell1-db-create-p2qrl\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.674040 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlj7v\" (UniqueName: \"kubernetes.io/projected/1550b18b-bd82-4f4e-a758-418ebc45100e-kube-api-access-dlj7v\") pod \"nova-cell0-0a69-account-create-update-gp59s\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 
crc kubenswrapper[4842]: I0311 19:19:20.675719 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-operator-scripts\") pod \"nova-cell1-db-create-p2qrl\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.690261 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrkwh\" (UniqueName: \"kubernetes.io/projected/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-kube-api-access-hrkwh\") pod \"nova-cell1-db-create-p2qrl\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.743236 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.777288 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlj7v\" (UniqueName: \"kubernetes.io/projected/1550b18b-bd82-4f4e-a758-418ebc45100e-kube-api-access-dlj7v\") pod \"nova-cell0-0a69-account-create-update-gp59s\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.777431 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1550b18b-bd82-4f4e-a758-418ebc45100e-operator-scripts\") pod \"nova-cell0-0a69-account-create-update-gp59s\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.778169 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1550b18b-bd82-4f4e-a758-418ebc45100e-operator-scripts\") pod \"nova-cell0-0a69-account-create-update-gp59s\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.811785 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlj7v\" (UniqueName: \"kubernetes.io/projected/1550b18b-bd82-4f4e-a758-418ebc45100e-kube-api-access-dlj7v\") pod \"nova-cell0-0a69-account-create-update-gp59s\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.828678 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.831960 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.833887 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell1-db-secret" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.840163 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw"] Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.846417 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.879487 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2nl\" (UniqueName: \"kubernetes.io/projected/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-kube-api-access-6j2nl\") pod \"nova-cell1-882e-account-create-update-vgnhw\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.879732 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-operator-scripts\") pod \"nova-cell1-882e-account-create-update-vgnhw\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.966732 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.980790 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-operator-scripts\") pod \"nova-cell1-882e-account-create-update-vgnhw\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.980900 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2nl\" (UniqueName: \"kubernetes.io/projected/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-kube-api-access-6j2nl\") pod \"nova-cell1-882e-account-create-update-vgnhw\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:20 crc kubenswrapper[4842]: I0311 19:19:20.981348 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-operator-scripts\") pod \"nova-cell1-882e-account-create-update-vgnhw\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:20.999224 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2nl\" (UniqueName: \"kubernetes.io/projected/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-kube-api-access-6j2nl\") pod \"nova-cell1-882e-account-create-update-vgnhw\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.012371 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-5cj2c"] Mar 
11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.099629 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gttk6"] Mar 11 19:19:21 crc kubenswrapper[4842]: W0311 19:19:21.118213 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c323f17_b38b_4762_b699_0f53719ebe74.slice/crio-59272b8356c23bfaf0ea4dd08b5db2703e25a04cfb4b1913317fc267e6cbf0f6 WatchSource:0}: Error finding container 59272b8356c23bfaf0ea4dd08b5db2703e25a04cfb4b1913317fc267e6cbf0f6: Status 404 returned error can't find the container with id 59272b8356c23bfaf0ea4dd08b5db2703e25a04cfb4b1913317fc267e6cbf0f6 Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.157822 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.219761 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-b1de-account-create-update-tdww2"] Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.298481 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-p2qrl"] Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.429474 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s"] Mar 11 19:19:21 crc kubenswrapper[4842]: W0311 19:19:21.432012 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1550b18b_bd82_4f4e_a758_418ebc45100e.slice/crio-037eb1ea27793114a30e6c29b5f46d21fb76aecc9afe5d6fe98d3c59e0807326 WatchSource:0}: Error finding container 037eb1ea27793114a30e6c29b5f46d21fb76aecc9afe5d6fe98d3c59e0807326: Status 404 returned error can't find the container with id 
037eb1ea27793114a30e6c29b5f46d21fb76aecc9afe5d6fe98d3c59e0807326 Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.615347 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw"] Mar 11 19:19:21 crc kubenswrapper[4842]: W0311 19:19:21.634231 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41b0dd88_16ef_4dc6_9f3b_a276fd87c154.slice/crio-e2c789caf693ce3e85e5f3b9fb8735db0187267e1a129596df1f474c922f309a WatchSource:0}: Error finding container e2c789caf693ce3e85e5f3b9fb8735db0187267e1a129596df1f474c922f309a: Status 404 returned error can't find the container with id e2c789caf693ce3e85e5f3b9fb8735db0187267e1a129596df1f474c922f309a Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.996879 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" event={"ID":"1550b18b-bd82-4f4e-a758-418ebc45100e","Type":"ContainerStarted","Data":"775646a243bc97c0eb00108bf2c234d3c6d7cefa0f6f992ab050520c7f84b32f"} Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.996934 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" event={"ID":"1550b18b-bd82-4f4e-a758-418ebc45100e","Type":"ContainerStarted","Data":"037eb1ea27793114a30e6c29b5f46d21fb76aecc9afe5d6fe98d3c59e0807326"} Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.998930 4842 generic.go:334] "Generic (PLEG): container finished" podID="32dddeff-d94c-4d7f-978a-d6930ee4d555" containerID="95be4b65cec964953b8f7bbbc204b195c77a8b2878efc486527749b9f66bcac0" exitCode=0 Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.999019 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" 
event={"ID":"32dddeff-d94c-4d7f-978a-d6930ee4d555","Type":"ContainerDied","Data":"95be4b65cec964953b8f7bbbc204b195c77a8b2878efc486527749b9f66bcac0"} Mar 11 19:19:21 crc kubenswrapper[4842]: I0311 19:19:21.999038 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" event={"ID":"32dddeff-d94c-4d7f-978a-d6930ee4d555","Type":"ContainerStarted","Data":"70664ed6d7e5850c28d95844df24061c4b9817266a8248da3c918e4d1280f628"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.001578 4842 generic.go:334] "Generic (PLEG): container finished" podID="3099459d-b582-437c-85a8-2e10562224c9" containerID="419057b210958847d4c9e7479bb1c4fbc48301a95ddb46a8ad9cb915ee3442c1" exitCode=0 Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.001670 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-5cj2c" event={"ID":"3099459d-b582-437c-85a8-2e10562224c9","Type":"ContainerDied","Data":"419057b210958847d4c9e7479bb1c4fbc48301a95ddb46a8ad9cb915ee3442c1"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.001938 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-5cj2c" event={"ID":"3099459d-b582-437c-85a8-2e10562224c9","Type":"ContainerStarted","Data":"b3e53673cacd664c04540fe390db043cbd4f4cfc7d88e5982fee5918e3706e4f"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.008633 4842 generic.go:334] "Generic (PLEG): container finished" podID="7c323f17-b38b-4762-b699-0f53719ebe74" containerID="e21f8e210fa7b0832f3f7981d35cb864a41246531795e7de2350e86a8f09a448" exitCode=0 Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.008858 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-gttk6" event={"ID":"7c323f17-b38b-4762-b699-0f53719ebe74","Type":"ContainerDied","Data":"e21f8e210fa7b0832f3f7981d35cb864a41246531795e7de2350e86a8f09a448"} Mar 11 19:19:22 crc kubenswrapper[4842]: 
I0311 19:19:22.008904 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-gttk6" event={"ID":"7c323f17-b38b-4762-b699-0f53719ebe74","Type":"ContainerStarted","Data":"59272b8356c23bfaf0ea4dd08b5db2703e25a04cfb4b1913317fc267e6cbf0f6"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.011583 4842 generic.go:334] "Generic (PLEG): container finished" podID="b4930d1d-6a6a-411f-b0d8-5efb91c732f3" containerID="7f782f0e862a0382c7570ae48b4caf788394118bb273b708be6d0b926d5e9820" exitCode=0 Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.011648 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" event={"ID":"b4930d1d-6a6a-411f-b0d8-5efb91c732f3","Type":"ContainerDied","Data":"7f782f0e862a0382c7570ae48b4caf788394118bb273b708be6d0b926d5e9820"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.011670 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" event={"ID":"b4930d1d-6a6a-411f-b0d8-5efb91c732f3","Type":"ContainerStarted","Data":"60f55ddefe7f6c613e2c2d097704b4e897173f2909ec51987ac40a0af7dad576"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.014012 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" event={"ID":"41b0dd88-16ef-4dc6-9f3b-a276fd87c154","Type":"ContainerStarted","Data":"fa440ff723cf8ad7315429b51c038cdc5747e1375e498ea0f7814f8eb17d80f0"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.014119 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" event={"ID":"41b0dd88-16ef-4dc6-9f3b-a276fd87c154","Type":"ContainerStarted","Data":"e2c789caf693ce3e85e5f3b9fb8735db0187267e1a129596df1f474c922f309a"} Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.023955 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" podStartSLOduration=2.023923473 podStartE2EDuration="2.023923473s" podCreationTimestamp="2026-03-11 19:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:22.017314955 +0000 UTC m=+1807.665011285" watchObservedRunningTime="2026-03-11 19:19:22.023923473 +0000 UTC m=+1807.671619793" Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.075580 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" podStartSLOduration=2.075552216 podStartE2EDuration="2.075552216s" podCreationTimestamp="2026-03-11 19:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:22.066969716 +0000 UTC m=+1807.714666006" watchObservedRunningTime="2026-03-11 19:19:22.075552216 +0000 UTC m=+1807.723248516" Mar 11 19:19:22 crc kubenswrapper[4842]: I0311 19:19:22.961872 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:19:22 crc kubenswrapper[4842]: E0311 19:19:22.962163 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.028917 4842 generic.go:334] "Generic (PLEG): container finished" podID="41b0dd88-16ef-4dc6-9f3b-a276fd87c154" containerID="fa440ff723cf8ad7315429b51c038cdc5747e1375e498ea0f7814f8eb17d80f0" exitCode=0 Mar 11 19:19:23 crc 
kubenswrapper[4842]: I0311 19:19:23.029038 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" event={"ID":"41b0dd88-16ef-4dc6-9f3b-a276fd87c154","Type":"ContainerDied","Data":"fa440ff723cf8ad7315429b51c038cdc5747e1375e498ea0f7814f8eb17d80f0"} Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.031093 4842 generic.go:334] "Generic (PLEG): container finished" podID="1550b18b-bd82-4f4e-a758-418ebc45100e" containerID="775646a243bc97c0eb00108bf2c234d3c6d7cefa0f6f992ab050520c7f84b32f" exitCode=0 Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.031127 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" event={"ID":"1550b18b-bd82-4f4e-a758-418ebc45100e","Type":"ContainerDied","Data":"775646a243bc97c0eb00108bf2c234d3c6d7cefa0f6f992ab050520c7f84b32f"} Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.398050 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.439982 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c323f17-b38b-4762-b699-0f53719ebe74-operator-scripts\") pod \"7c323f17-b38b-4762-b699-0f53719ebe74\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.440038 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm2r6\" (UniqueName: \"kubernetes.io/projected/7c323f17-b38b-4762-b699-0f53719ebe74-kube-api-access-pm2r6\") pod \"7c323f17-b38b-4762-b699-0f53719ebe74\" (UID: \"7c323f17-b38b-4762-b699-0f53719ebe74\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.441124 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c323f17-b38b-4762-b699-0f53719ebe74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c323f17-b38b-4762-b699-0f53719ebe74" (UID: "7c323f17-b38b-4762-b699-0f53719ebe74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.446335 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c323f17-b38b-4762-b699-0f53719ebe74-kube-api-access-pm2r6" (OuterVolumeSpecName: "kube-api-access-pm2r6") pod "7c323f17-b38b-4762-b699-0f53719ebe74" (UID: "7c323f17-b38b-4762-b699-0f53719ebe74"). InnerVolumeSpecName "kube-api-access-pm2r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.542584 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c323f17-b38b-4762-b699-0f53719ebe74-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.542635 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm2r6\" (UniqueName: \"kubernetes.io/projected/7c323f17-b38b-4762-b699-0f53719ebe74-kube-api-access-pm2r6\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.566484 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.572314 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.577526 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.643764 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-operator-scripts\") pod \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.643858 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvhzv\" (UniqueName: \"kubernetes.io/projected/3099459d-b582-437c-85a8-2e10562224c9-kube-api-access-qvhzv\") pod \"3099459d-b582-437c-85a8-2e10562224c9\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.643892 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrkwh\" (UniqueName: \"kubernetes.io/projected/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-kube-api-access-hrkwh\") pod \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\" (UID: \"b4930d1d-6a6a-411f-b0d8-5efb91c732f3\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.643945 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwcxs\" (UniqueName: \"kubernetes.io/projected/32dddeff-d94c-4d7f-978a-d6930ee4d555-kube-api-access-mwcxs\") pod \"32dddeff-d94c-4d7f-978a-d6930ee4d555\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.644055 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32dddeff-d94c-4d7f-978a-d6930ee4d555-operator-scripts\") pod \"32dddeff-d94c-4d7f-978a-d6930ee4d555\" (UID: \"32dddeff-d94c-4d7f-978a-d6930ee4d555\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.644106 4842 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3099459d-b582-437c-85a8-2e10562224c9-operator-scripts\") pod \"3099459d-b582-437c-85a8-2e10562224c9\" (UID: \"3099459d-b582-437c-85a8-2e10562224c9\") " Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.644248 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4930d1d-6a6a-411f-b0d8-5efb91c732f3" (UID: "b4930d1d-6a6a-411f-b0d8-5efb91c732f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.644512 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.644765 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32dddeff-d94c-4d7f-978a-d6930ee4d555-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32dddeff-d94c-4d7f-978a-d6930ee4d555" (UID: "32dddeff-d94c-4d7f-978a-d6930ee4d555"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.644784 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3099459d-b582-437c-85a8-2e10562224c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3099459d-b582-437c-85a8-2e10562224c9" (UID: "3099459d-b582-437c-85a8-2e10562224c9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.647467 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dddeff-d94c-4d7f-978a-d6930ee4d555-kube-api-access-mwcxs" (OuterVolumeSpecName: "kube-api-access-mwcxs") pod "32dddeff-d94c-4d7f-978a-d6930ee4d555" (UID: "32dddeff-d94c-4d7f-978a-d6930ee4d555"). InnerVolumeSpecName "kube-api-access-mwcxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.647498 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-kube-api-access-hrkwh" (OuterVolumeSpecName: "kube-api-access-hrkwh") pod "b4930d1d-6a6a-411f-b0d8-5efb91c732f3" (UID: "b4930d1d-6a6a-411f-b0d8-5efb91c732f3"). InnerVolumeSpecName "kube-api-access-hrkwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.648633 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3099459d-b582-437c-85a8-2e10562224c9-kube-api-access-qvhzv" (OuterVolumeSpecName: "kube-api-access-qvhzv") pod "3099459d-b582-437c-85a8-2e10562224c9" (UID: "3099459d-b582-437c-85a8-2e10562224c9"). InnerVolumeSpecName "kube-api-access-qvhzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.746020 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvhzv\" (UniqueName: \"kubernetes.io/projected/3099459d-b582-437c-85a8-2e10562224c9-kube-api-access-qvhzv\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.746067 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrkwh\" (UniqueName: \"kubernetes.io/projected/b4930d1d-6a6a-411f-b0d8-5efb91c732f3-kube-api-access-hrkwh\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.746080 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwcxs\" (UniqueName: \"kubernetes.io/projected/32dddeff-d94c-4d7f-978a-d6930ee4d555-kube-api-access-mwcxs\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.746094 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32dddeff-d94c-4d7f-978a-d6930ee4d555-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:23 crc kubenswrapper[4842]: I0311 19:19:23.746108 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3099459d-b582-437c-85a8-2e10562224c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.043238 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-5cj2c" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.043255 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-5cj2c" event={"ID":"3099459d-b582-437c-85a8-2e10562224c9","Type":"ContainerDied","Data":"b3e53673cacd664c04540fe390db043cbd4f4cfc7d88e5982fee5918e3706e4f"} Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.043364 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e53673cacd664c04540fe390db043cbd4f4cfc7d88e5982fee5918e3706e4f" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.046710 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-gttk6" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.046719 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-gttk6" event={"ID":"7c323f17-b38b-4762-b699-0f53719ebe74","Type":"ContainerDied","Data":"59272b8356c23bfaf0ea4dd08b5db2703e25a04cfb4b1913317fc267e6cbf0f6"} Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.047157 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59272b8356c23bfaf0ea4dd08b5db2703e25a04cfb4b1913317fc267e6cbf0f6" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.049408 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" event={"ID":"b4930d1d-6a6a-411f-b0d8-5efb91c732f3","Type":"ContainerDied","Data":"60f55ddefe7f6c613e2c2d097704b4e897173f2909ec51987ac40a0af7dad576"} Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.049506 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f55ddefe7f6c613e2c2d097704b4e897173f2909ec51987ac40a0af7dad576" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.049758 4842 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-p2qrl" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.050760 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.051176 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-b1de-account-create-update-tdww2" event={"ID":"32dddeff-d94c-4d7f-978a-d6930ee4d555","Type":"ContainerDied","Data":"70664ed6d7e5850c28d95844df24061c4b9817266a8248da3c918e4d1280f628"} Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.051198 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70664ed6d7e5850c28d95844df24061c4b9817266a8248da3c918e4d1280f628" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.413323 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.462098 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlj7v\" (UniqueName: \"kubernetes.io/projected/1550b18b-bd82-4f4e-a758-418ebc45100e-kube-api-access-dlj7v\") pod \"1550b18b-bd82-4f4e-a758-418ebc45100e\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.462218 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1550b18b-bd82-4f4e-a758-418ebc45100e-operator-scripts\") pod \"1550b18b-bd82-4f4e-a758-418ebc45100e\" (UID: \"1550b18b-bd82-4f4e-a758-418ebc45100e\") " Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.462938 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1550b18b-bd82-4f4e-a758-418ebc45100e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1550b18b-bd82-4f4e-a758-418ebc45100e" (UID: "1550b18b-bd82-4f4e-a758-418ebc45100e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.467909 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1550b18b-bd82-4f4e-a758-418ebc45100e-kube-api-access-dlj7v" (OuterVolumeSpecName: "kube-api-access-dlj7v") pod "1550b18b-bd82-4f4e-a758-418ebc45100e" (UID: "1550b18b-bd82-4f4e-a758-418ebc45100e"). InnerVolumeSpecName "kube-api-access-dlj7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.516555 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.563195 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-operator-scripts\") pod \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.563276 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j2nl\" (UniqueName: \"kubernetes.io/projected/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-kube-api-access-6j2nl\") pod \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\" (UID: \"41b0dd88-16ef-4dc6-9f3b-a276fd87c154\") " Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.563609 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1550b18b-bd82-4f4e-a758-418ebc45100e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:24 
crc kubenswrapper[4842]: I0311 19:19:24.563624 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlj7v\" (UniqueName: \"kubernetes.io/projected/1550b18b-bd82-4f4e-a758-418ebc45100e-kube-api-access-dlj7v\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.563955 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41b0dd88-16ef-4dc6-9f3b-a276fd87c154" (UID: "41b0dd88-16ef-4dc6-9f3b-a276fd87c154"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.569392 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-kube-api-access-6j2nl" (OuterVolumeSpecName: "kube-api-access-6j2nl") pod "41b0dd88-16ef-4dc6-9f3b-a276fd87c154" (UID: "41b0dd88-16ef-4dc6-9f3b-a276fd87c154"). InnerVolumeSpecName "kube-api-access-6j2nl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.664644 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:24 crc kubenswrapper[4842]: I0311 19:19:24.664676 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j2nl\" (UniqueName: \"kubernetes.io/projected/41b0dd88-16ef-4dc6-9f3b-a276fd87c154-kube-api-access-6j2nl\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.069154 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" event={"ID":"41b0dd88-16ef-4dc6-9f3b-a276fd87c154","Type":"ContainerDied","Data":"e2c789caf693ce3e85e5f3b9fb8735db0187267e1a129596df1f474c922f309a"} Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.070014 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2c789caf693ce3e85e5f3b9fb8735db0187267e1a129596df1f474c922f309a" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.069183 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.071042 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" event={"ID":"1550b18b-bd82-4f4e-a758-418ebc45100e","Type":"ContainerDied","Data":"037eb1ea27793114a30e6c29b5f46d21fb76aecc9afe5d6fe98d3c59e0807326"} Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.071084 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="037eb1ea27793114a30e6c29b5f46d21fb76aecc9afe5d6fe98d3c59e0807326" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.071068 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.870614 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2"] Mar 11 19:19:25 crc kubenswrapper[4842]: E0311 19:19:25.871748 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3099459d-b582-437c-85a8-2e10562224c9" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.871787 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="3099459d-b582-437c-85a8-2e10562224c9" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: E0311 19:19:25.871812 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c323f17-b38b-4762-b699-0f53719ebe74" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.871824 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c323f17-b38b-4762-b699-0f53719ebe74" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: E0311 19:19:25.871855 4842 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32dddeff-d94c-4d7f-978a-d6930ee4d555" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.871866 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dddeff-d94c-4d7f-978a-d6930ee4d555" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: E0311 19:19:25.871886 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4930d1d-6a6a-411f-b0d8-5efb91c732f3" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.871897 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4930d1d-6a6a-411f-b0d8-5efb91c732f3" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: E0311 19:19:25.871921 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1550b18b-bd82-4f4e-a758-418ebc45100e" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.871932 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1550b18b-bd82-4f4e-a758-418ebc45100e" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: E0311 19:19:25.871954 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b0dd88-16ef-4dc6-9f3b-a276fd87c154" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.871964 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b0dd88-16ef-4dc6-9f3b-a276fd87c154" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.872211 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="3099459d-b582-437c-85a8-2e10562224c9" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.872231 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b0dd88-16ef-4dc6-9f3b-a276fd87c154" containerName="mariadb-account-create-update" Mar 11 
19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.872249 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dddeff-d94c-4d7f-978a-d6930ee4d555" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.872266 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4930d1d-6a6a-411f-b0d8-5efb91c732f3" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.872351 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1550b18b-bd82-4f4e-a758-418ebc45100e" containerName="mariadb-account-create-update" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.872374 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c323f17-b38b-4762-b699-0f53719ebe74" containerName="mariadb-database-create" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.873052 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.875593 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-tr97s" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.875843 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.876198 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-scripts" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.886423 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2"] Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.984899 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szfj\" 
(UniqueName: \"kubernetes.io/projected/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-kube-api-access-5szfj\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.984948 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:25 crc kubenswrapper[4842]: I0311 19:19:25.985094 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 19:19:26.086445 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 19:19:26.086524 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 
19:19:26.086613 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szfj\" (UniqueName: \"kubernetes.io/projected/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-kube-api-access-5szfj\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 19:19:26.090565 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 19:19:26.092036 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 19:19:26.103367 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szfj\" (UniqueName: \"kubernetes.io/projected/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-kube-api-access-5szfj\") pod \"nova-kuttl-cell0-conductor-db-sync-tk4f2\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 19:19:26.194418 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:26 crc kubenswrapper[4842]: I0311 19:19:26.647099 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2"] Mar 11 19:19:27 crc kubenswrapper[4842]: I0311 19:19:27.112610 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" event={"ID":"a7fe11e4-c139-4cab-bcc4-989b2e2fb979","Type":"ContainerStarted","Data":"ff3c344cd0c2091ce202287f80e01277abde41e1a6484ded7ad6494ae63a4c87"} Mar 11 19:19:27 crc kubenswrapper[4842]: I0311 19:19:27.113380 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" event={"ID":"a7fe11e4-c139-4cab-bcc4-989b2e2fb979","Type":"ContainerStarted","Data":"00d732d286a5ffc7932dee864e406504bb9ba47f97c549e3829752917b51ec2a"} Mar 11 19:19:27 crc kubenswrapper[4842]: I0311 19:19:27.148891 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" podStartSLOduration=2.148874486 podStartE2EDuration="2.148874486s" podCreationTimestamp="2026-03-11 19:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:27.143200064 +0000 UTC m=+1812.790896374" watchObservedRunningTime="2026-03-11 19:19:27.148874486 +0000 UTC m=+1812.796570786" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.062612 4842 scope.go:117] "RemoveContainer" containerID="92ce0de87c594880408b7f38d1af77724abee78631a9a283315bf5635ba42013" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.098325 4842 scope.go:117] "RemoveContainer" containerID="2a7b06faeff9601af31d2935eccc03b55a6eaface49a0619f30007b15a26f44f" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.142239 4842 scope.go:117] "RemoveContainer" 
containerID="f504992ca2617068de6efac7481db69261c7fffaadee27c157bdbae757b8ac09" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.173763 4842 scope.go:117] "RemoveContainer" containerID="f74d98a84106985b1b7e16462634e52b457f1df4056f407abbf4a1a02ea95911" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.230054 4842 scope.go:117] "RemoveContainer" containerID="fbd32d5d5fae101899de3c730b117bf79165112040f42ff40b9a7e49b838f4a8" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.274226 4842 scope.go:117] "RemoveContainer" containerID="3015bf7bfb6a123616bf2f72563ed9e438d15f01bb1d593fc982b7198d3eef88" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.290946 4842 scope.go:117] "RemoveContainer" containerID="7881df410e62ea15b3042b5252e8acf2583cc8f1829451ade594c50292032a1e" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.321315 4842 scope.go:117] "RemoveContainer" containerID="e9aef4079a359c758b236029cb262ec400935a6ce589d05b12396a6477b58011" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.340428 4842 scope.go:117] "RemoveContainer" containerID="c52fea5fcc683c704f3b76dd832e49d4fa54746978560143c46899473e49cf1f" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.359191 4842 scope.go:117] "RemoveContainer" containerID="dec1f6a7db47241e7d41782718dd7609b89a7e247c2ef60ab8512b11da23b8f8" Mar 11 19:19:29 crc kubenswrapper[4842]: I0311 19:19:29.374773 4842 scope.go:117] "RemoveContainer" containerID="31bd540836d1ecfb461194fee32946cce88c1caa154bb283417af56ab14b0b8e" Mar 11 19:19:31 crc kubenswrapper[4842]: I0311 19:19:31.048943 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/keystone-db-sync-g42cg"] Mar 11 19:19:31 crc kubenswrapper[4842]: I0311 19:19:31.056848 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/keystone-db-sync-g42cg"] Mar 11 19:19:32 crc kubenswrapper[4842]: I0311 19:19:32.173082 4842 generic.go:334] "Generic (PLEG): container finished" 
podID="a7fe11e4-c139-4cab-bcc4-989b2e2fb979" containerID="ff3c344cd0c2091ce202287f80e01277abde41e1a6484ded7ad6494ae63a4c87" exitCode=0 Mar 11 19:19:32 crc kubenswrapper[4842]: I0311 19:19:32.173130 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" event={"ID":"a7fe11e4-c139-4cab-bcc4-989b2e2fb979","Type":"ContainerDied","Data":"ff3c344cd0c2091ce202287f80e01277abde41e1a6484ded7ad6494ae63a4c87"} Mar 11 19:19:32 crc kubenswrapper[4842]: I0311 19:19:32.982573 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb" path="/var/lib/kubelet/pods/7c5c9d42-6509-4a4e-99c0-ac4b1e9e5fdb/volumes" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.619315 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.748154 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-config-data\") pod \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.748307 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-scripts\") pod \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.748412 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5szfj\" (UniqueName: \"kubernetes.io/projected/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-kube-api-access-5szfj\") pod \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\" (UID: \"a7fe11e4-c139-4cab-bcc4-989b2e2fb979\") " Mar 11 19:19:33 crc 
kubenswrapper[4842]: I0311 19:19:33.759456 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-kube-api-access-5szfj" (OuterVolumeSpecName: "kube-api-access-5szfj") pod "a7fe11e4-c139-4cab-bcc4-989b2e2fb979" (UID: "a7fe11e4-c139-4cab-bcc4-989b2e2fb979"). InnerVolumeSpecName "kube-api-access-5szfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.761594 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-scripts" (OuterVolumeSpecName: "scripts") pod "a7fe11e4-c139-4cab-bcc4-989b2e2fb979" (UID: "a7fe11e4-c139-4cab-bcc4-989b2e2fb979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.783460 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-config-data" (OuterVolumeSpecName: "config-data") pod "a7fe11e4-c139-4cab-bcc4-989b2e2fb979" (UID: "a7fe11e4-c139-4cab-bcc4-989b2e2fb979"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.850825 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.850870 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5szfj\" (UniqueName: \"kubernetes.io/projected/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-kube-api-access-5szfj\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.850886 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7fe11e4-c139-4cab-bcc4-989b2e2fb979-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:33 crc kubenswrapper[4842]: I0311 19:19:33.962094 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:19:33 crc kubenswrapper[4842]: E0311 19:19:33.962663 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.192996 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" event={"ID":"a7fe11e4-c139-4cab-bcc4-989b2e2fb979","Type":"ContainerDied","Data":"00d732d286a5ffc7932dee864e406504bb9ba47f97c549e3829752917b51ec2a"} Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.193047 4842 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="00d732d286a5ffc7932dee864e406504bb9ba47f97c549e3829752917b51ec2a" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.193108 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.280773 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:19:34 crc kubenswrapper[4842]: E0311 19:19:34.281370 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fe11e4-c139-4cab-bcc4-989b2e2fb979" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.281400 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fe11e4-c139-4cab-bcc4-989b2e2fb979" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.281681 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fe11e4-c139-4cab-bcc4-989b2e2fb979" containerName="nova-kuttl-cell0-conductor-db-sync" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.282539 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.285025 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.285346 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-tr97s" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.303975 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.358490 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhk2s\" (UniqueName: \"kubernetes.io/projected/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-kube-api-access-dhk2s\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.358565 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.460500 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhk2s\" (UniqueName: \"kubernetes.io/projected/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-kube-api-access-dhk2s\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.460649 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.464576 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.478528 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhk2s\" (UniqueName: \"kubernetes.io/projected/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-kube-api-access-dhk2s\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:34 crc kubenswrapper[4842]: I0311 19:19:34.628428 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:35 crc kubenswrapper[4842]: I0311 19:19:35.080859 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:19:35 crc kubenswrapper[4842]: W0311 19:19:35.085249 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd067782b_b6da_4dcc_a0a9_0ecbdcfcf142.slice/crio-8de75d25ecd546f2b0f0f2db4b81ab6b1c7d0aab2a6cbd6cce92ddb3269141c2 WatchSource:0}: Error finding container 8de75d25ecd546f2b0f0f2db4b81ab6b1c7d0aab2a6cbd6cce92ddb3269141c2: Status 404 returned error can't find the container with id 8de75d25ecd546f2b0f0f2db4b81ab6b1c7d0aab2a6cbd6cce92ddb3269141c2 Mar 11 19:19:35 crc kubenswrapper[4842]: I0311 19:19:35.205737 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142","Type":"ContainerStarted","Data":"8de75d25ecd546f2b0f0f2db4b81ab6b1c7d0aab2a6cbd6cce92ddb3269141c2"} Mar 11 19:19:36 crc kubenswrapper[4842]: I0311 19:19:36.226453 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142","Type":"ContainerStarted","Data":"511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b"} Mar 11 19:19:36 crc kubenswrapper[4842]: I0311 19:19:36.251094 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podStartSLOduration=2.251074282 podStartE2EDuration="2.251074282s" podCreationTimestamp="2026-03-11 19:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:36.241901836 +0000 UTC m=+1821.889598146" watchObservedRunningTime="2026-03-11 19:19:36.251074282 +0000 
UTC m=+1821.898770582" Mar 11 19:19:37 crc kubenswrapper[4842]: I0311 19:19:37.232778 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:40 crc kubenswrapper[4842]: I0311 19:19:40.047205 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/placement-db-sync-nk2fn"] Mar 11 19:19:40 crc kubenswrapper[4842]: I0311 19:19:40.060319 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/placement-db-sync-nk2fn"] Mar 11 19:19:40 crc kubenswrapper[4842]: I0311 19:19:40.974040 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8988f9d-f9a5-4611-b74f-12833fb5b143" path="/var/lib/kubelet/pods/b8988f9d-f9a5-4611-b74f-12833fb5b143/volumes" Mar 11 19:19:44 crc kubenswrapper[4842]: I0311 19:19:44.656158 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:19:44 crc kubenswrapper[4842]: I0311 19:19:44.966107 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:19:44 crc kubenswrapper[4842]: E0311 19:19:44.966908 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.031612 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/keystone-bootstrap-sdw9r"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.040016 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/keystone-bootstrap-sdw9r"] Mar 
11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.223964 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.224968 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.227830 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-config-data" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.228211 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-scripts" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.244901 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.339627 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-config-data\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.339736 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fw4g\" (UniqueName: \"kubernetes.io/projected/c29d634a-779e-4248-b485-a586c702d4b4-kube-api-access-6fw4g\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.339767 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-scripts\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.441011 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fw4g\" (UniqueName: \"kubernetes.io/projected/c29d634a-779e-4248-b485-a586c702d4b4-kube-api-access-6fw4g\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.441306 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-scripts\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.441436 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-config-data\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.447618 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.447788 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-scripts\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 
11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.447904 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-config-data\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.457081 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.463117 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.472681 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.493955 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fw4g\" (UniqueName: \"kubernetes.io/projected/c29d634a-779e-4248-b485-a586c702d4b4-kube-api-access-6fw4g\") pod \"nova-kuttl-cell0-cell-mapping-jcpbg\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.527913 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.528851 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.535065 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.535473 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-novncproxy-config-data" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.541401 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.544342 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b06c-578b-402e-893c-d5bc91514b96-config-data\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.544385 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqnkl\" (UniqueName: \"kubernetes.io/projected/7988b06c-578b-402e-893c-d5bc91514b96-kube-api-access-bqnkl\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.544432 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b06c-578b-402e-893c-d5bc91514b96-logs\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.594490 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:45 crc 
kubenswrapper[4842]: I0311 19:19:45.595711 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.605229 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.613310 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.655121 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75k2h\" (UniqueName: \"kubernetes.io/projected/7836e47f-43be-4263-a0e8-0a3e209ce400-kube-api-access-75k2h\") pod \"nova-kuttl-scheduler-0\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.655164 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kksv4\" (UniqueName: \"kubernetes.io/projected/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-kube-api-access-kksv4\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.655198 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b06c-578b-402e-893c-d5bc91514b96-logs\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.655237 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7836e47f-43be-4263-a0e8-0a3e209ce400-config-data\") pod 
\"nova-kuttl-scheduler-0\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.656172 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.656236 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b06c-578b-402e-893c-d5bc91514b96-config-data\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.656279 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqnkl\" (UniqueName: \"kubernetes.io/projected/7988b06c-578b-402e-893c-d5bc91514b96-kube-api-access-bqnkl\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.656913 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b06c-578b-402e-893c-d5bc91514b96-logs\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.677866 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b06c-578b-402e-893c-d5bc91514b96-config-data\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 
19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.679288 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqnkl\" (UniqueName: \"kubernetes.io/projected/7988b06c-578b-402e-893c-d5bc91514b96-kube-api-access-bqnkl\") pod \"nova-kuttl-api-0\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.693548 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.695585 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.701649 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.723737 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.762096 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61595497-681e-46b6-8625-7488cb61e157-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.762139 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzmqb\" (UniqueName: \"kubernetes.io/projected/61595497-681e-46b6-8625-7488cb61e157-kube-api-access-wzmqb\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.762178 4842 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7836e47f-43be-4263-a0e8-0a3e209ce400-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.762260 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.762300 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61595497-681e-46b6-8625-7488cb61e157-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.762342 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75k2h\" (UniqueName: \"kubernetes.io/projected/7836e47f-43be-4263-a0e8-0a3e209ce400-kube-api-access-75k2h\") pod \"nova-kuttl-scheduler-0\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.762361 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kksv4\" (UniqueName: \"kubernetes.io/projected/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-kube-api-access-kksv4\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.775158 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7836e47f-43be-4263-a0e8-0a3e209ce400-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.776283 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.780000 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kksv4\" (UniqueName: \"kubernetes.io/projected/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-kube-api-access-kksv4\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.785443 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75k2h\" (UniqueName: \"kubernetes.io/projected/7836e47f-43be-4263-a0e8-0a3e209ce400-kube-api-access-75k2h\") pod \"nova-kuttl-scheduler-0\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.864185 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61595497-681e-46b6-8625-7488cb61e157-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.864303 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/61595497-681e-46b6-8625-7488cb61e157-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.864344 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzmqb\" (UniqueName: \"kubernetes.io/projected/61595497-681e-46b6-8625-7488cb61e157-kube-api-access-wzmqb\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.865078 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61595497-681e-46b6-8625-7488cb61e157-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.870698 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61595497-681e-46b6-8625-7488cb61e157-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.877434 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.882004 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzmqb\" (UniqueName: \"kubernetes.io/projected/61595497-681e-46b6-8625-7488cb61e157-kube-api-access-wzmqb\") pod \"nova-kuttl-metadata-0\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.976893 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:45 crc kubenswrapper[4842]: I0311 19:19:45.991254 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.019354 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.092330 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg"] Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.241778 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz"] Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.244193 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.251609 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.251792 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-scripts" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.257585 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz"] Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.317337 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" event={"ID":"c29d634a-779e-4248-b485-a586c702d4b4","Type":"ContainerStarted","Data":"00df56ef99efcef025e43d4a4421124718ae32918dcc854766eb18bb965154e4"} Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.317392 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" event={"ID":"c29d634a-779e-4248-b485-a586c702d4b4","Type":"ContainerStarted","Data":"40a8975479615d09734913c70b52250b380c5bdf547dbab78a0ae3952a095ad2"} Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.340042 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" podStartSLOduration=1.340023937 podStartE2EDuration="1.340023937s" podCreationTimestamp="2026-03-11 19:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:46.330115212 +0000 UTC m=+1831.977811492" watchObservedRunningTime="2026-03-11 19:19:46.340023937 +0000 UTC m=+1831.987720217" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.350588 4842 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.372712 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvt94\" (UniqueName: \"kubernetes.io/projected/190ff3a6-1b86-490e-9e67-324a02a35b55-kube-api-access-zvt94\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.372866 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.372928 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.474160 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.474201 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.474302 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvt94\" (UniqueName: \"kubernetes.io/projected/190ff3a6-1b86-490e-9e67-324a02a35b55-kube-api-access-zvt94\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.489155 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.489358 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.496180 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvt94\" (UniqueName: \"kubernetes.io/projected/190ff3a6-1b86-490e-9e67-324a02a35b55-kube-api-access-zvt94\") pod \"nova-kuttl-cell1-conductor-db-sync-gmxxz\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.508705 4842 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.580719 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.595555 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.607605 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:46 crc kubenswrapper[4842]: I0311 19:19:46.973161 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edfaec1-653a-4bd8-b8b1-13180eefe66b" path="/var/lib/kubelet/pods/6edfaec1-653a-4bd8-b8b1-13180eefe66b/volumes" Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.055246 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz"] Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.325701 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a","Type":"ContainerStarted","Data":"93812d34795315ed6f64e2f14a3507fe676d9f9021270ba164b62bc3152489a6"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.325756 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a","Type":"ContainerStarted","Data":"1df350bec6b2985a864cda0bbcd1ab2f72fe7d59973b1644dc869bd95fd21620"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.327309 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" 
event={"ID":"190ff3a6-1b86-490e-9e67-324a02a35b55","Type":"ContainerStarted","Data":"f1ed2a7d9055695e82596670612f7f30bdc582859969e06d8af1b308c701eb42"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.327342 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" event={"ID":"190ff3a6-1b86-490e-9e67-324a02a35b55","Type":"ContainerStarted","Data":"82a6a77929678326a3c4a521a53be7234e10ceb8808e5eaf993f33dc8f15da00"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.331517 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7836e47f-43be-4263-a0e8-0a3e209ce400","Type":"ContainerStarted","Data":"bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.331703 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7836e47f-43be-4263-a0e8-0a3e209ce400","Type":"ContainerStarted","Data":"4e98fbfbe6ecf491aca3df93aec60ee08c7096aa94d844e32658a1442732dfbf"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.334547 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"61595497-681e-46b6-8625-7488cb61e157","Type":"ContainerStarted","Data":"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.334994 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"61595497-681e-46b6-8625-7488cb61e157","Type":"ContainerStarted","Data":"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.335088 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" 
event={"ID":"61595497-681e-46b6-8625-7488cb61e157","Type":"ContainerStarted","Data":"ab8a4f9ee7d190a545e75e4dfe47ef30daffa3f92d19baf70f60e67cd3950522"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.336642 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7988b06c-578b-402e-893c-d5bc91514b96","Type":"ContainerStarted","Data":"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.336684 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7988b06c-578b-402e-893c-d5bc91514b96","Type":"ContainerStarted","Data":"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.336694 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7988b06c-578b-402e-893c-d5bc91514b96","Type":"ContainerStarted","Data":"0dfc9ed8be72f25be48caf2554ee0291ed5a93e39a264dd4e093f05615a48e55"} Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.344926 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podStartSLOduration=2.344904262 podStartE2EDuration="2.344904262s" podCreationTimestamp="2026-03-11 19:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:47.340507585 +0000 UTC m=+1832.988203875" watchObservedRunningTime="2026-03-11 19:19:47.344904262 +0000 UTC m=+1832.992600552" Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.362129 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.362114424 podStartE2EDuration="2.362114424s" podCreationTimestamp="2026-03-11 19:19:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:47.360637824 +0000 UTC m=+1833.008334144" watchObservedRunningTime="2026-03-11 19:19:47.362114424 +0000 UTC m=+1833.009810694" Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.382102 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" podStartSLOduration=1.382086519 podStartE2EDuration="1.382086519s" podCreationTimestamp="2026-03-11 19:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:47.376513749 +0000 UTC m=+1833.024210029" watchObservedRunningTime="2026-03-11 19:19:47.382086519 +0000 UTC m=+1833.029782799" Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.394900 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.394883142 podStartE2EDuration="2.394883142s" podCreationTimestamp="2026-03-11 19:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:47.392747834 +0000 UTC m=+1833.040444114" watchObservedRunningTime="2026-03-11 19:19:47.394883142 +0000 UTC m=+1833.042579422" Mar 11 19:19:47 crc kubenswrapper[4842]: I0311 19:19:47.412851 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.412832073 podStartE2EDuration="2.412832073s" podCreationTimestamp="2026-03-11 19:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:47.406816791 +0000 UTC m=+1833.054513071" watchObservedRunningTime="2026-03-11 19:19:47.412832073 +0000 UTC m=+1833.060528353" 
Mar 11 19:19:50 crc kubenswrapper[4842]: I0311 19:19:50.362314 4842 generic.go:334] "Generic (PLEG): container finished" podID="190ff3a6-1b86-490e-9e67-324a02a35b55" containerID="f1ed2a7d9055695e82596670612f7f30bdc582859969e06d8af1b308c701eb42" exitCode=0 Mar 11 19:19:50 crc kubenswrapper[4842]: I0311 19:19:50.362405 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" event={"ID":"190ff3a6-1b86-490e-9e67-324a02a35b55","Type":"ContainerDied","Data":"f1ed2a7d9055695e82596670612f7f30bdc582859969e06d8af1b308c701eb42"} Mar 11 19:19:50 crc kubenswrapper[4842]: I0311 19:19:50.979304 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:50 crc kubenswrapper[4842]: I0311 19:19:50.992101 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.378331 4842 generic.go:334] "Generic (PLEG): container finished" podID="c29d634a-779e-4248-b485-a586c702d4b4" containerID="00df56ef99efcef025e43d4a4421124718ae32918dcc854766eb18bb965154e4" exitCode=0 Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.378415 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" event={"ID":"c29d634a-779e-4248-b485-a586c702d4b4","Type":"ContainerDied","Data":"00df56ef99efcef025e43d4a4421124718ae32918dcc854766eb18bb965154e4"} Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.838848 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.867745 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-config-data\") pod \"190ff3a6-1b86-490e-9e67-324a02a35b55\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.867906 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvt94\" (UniqueName: \"kubernetes.io/projected/190ff3a6-1b86-490e-9e67-324a02a35b55-kube-api-access-zvt94\") pod \"190ff3a6-1b86-490e-9e67-324a02a35b55\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.867965 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-scripts\") pod \"190ff3a6-1b86-490e-9e67-324a02a35b55\" (UID: \"190ff3a6-1b86-490e-9e67-324a02a35b55\") " Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.872923 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-scripts" (OuterVolumeSpecName: "scripts") pod "190ff3a6-1b86-490e-9e67-324a02a35b55" (UID: "190ff3a6-1b86-490e-9e67-324a02a35b55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.873093 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190ff3a6-1b86-490e-9e67-324a02a35b55-kube-api-access-zvt94" (OuterVolumeSpecName: "kube-api-access-zvt94") pod "190ff3a6-1b86-490e-9e67-324a02a35b55" (UID: "190ff3a6-1b86-490e-9e67-324a02a35b55"). InnerVolumeSpecName "kube-api-access-zvt94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.889449 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-config-data" (OuterVolumeSpecName: "config-data") pod "190ff3a6-1b86-490e-9e67-324a02a35b55" (UID: "190ff3a6-1b86-490e-9e67-324a02a35b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.969429 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.969457 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvt94\" (UniqueName: \"kubernetes.io/projected/190ff3a6-1b86-490e-9e67-324a02a35b55-kube-api-access-zvt94\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:51 crc kubenswrapper[4842]: I0311 19:19:51.969467 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/190ff3a6-1b86-490e-9e67-324a02a35b55-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.387947 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" event={"ID":"190ff3a6-1b86-490e-9e67-324a02a35b55","Type":"ContainerDied","Data":"82a6a77929678326a3c4a521a53be7234e10ceb8808e5eaf993f33dc8f15da00"} Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.387983 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82a6a77929678326a3c4a521a53be7234e10ceb8808e5eaf993f33dc8f15da00" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.389462 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.463399 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:19:52 crc kubenswrapper[4842]: E0311 19:19:52.463731 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190ff3a6-1b86-490e-9e67-324a02a35b55" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.463747 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="190ff3a6-1b86-490e-9e67-324a02a35b55" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.463896 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="190ff3a6-1b86-490e-9e67-324a02a35b55" containerName="nova-kuttl-cell1-conductor-db-sync" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.464448 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.471559 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.479259 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.578961 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.579364 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq28m\" (UniqueName: \"kubernetes.io/projected/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-kube-api-access-sq28m\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.681374 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq28m\" (UniqueName: \"kubernetes.io/projected/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-kube-api-access-sq28m\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.681481 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: 
\"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.686523 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.701875 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq28m\" (UniqueName: \"kubernetes.io/projected/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-kube-api-access-sq28m\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.731406 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.782413 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-config-data\") pod \"c29d634a-779e-4248-b485-a586c702d4b4\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.782477 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fw4g\" (UniqueName: \"kubernetes.io/projected/c29d634a-779e-4248-b485-a586c702d4b4-kube-api-access-6fw4g\") pod \"c29d634a-779e-4248-b485-a586c702d4b4\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.782547 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-scripts\") pod \"c29d634a-779e-4248-b485-a586c702d4b4\" (UID: \"c29d634a-779e-4248-b485-a586c702d4b4\") " Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.793394 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-scripts" (OuterVolumeSpecName: "scripts") pod "c29d634a-779e-4248-b485-a586c702d4b4" (UID: "c29d634a-779e-4248-b485-a586c702d4b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.793430 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29d634a-779e-4248-b485-a586c702d4b4-kube-api-access-6fw4g" (OuterVolumeSpecName: "kube-api-access-6fw4g") pod "c29d634a-779e-4248-b485-a586c702d4b4" (UID: "c29d634a-779e-4248-b485-a586c702d4b4"). InnerVolumeSpecName "kube-api-access-6fw4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.793842 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.814204 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-config-data" (OuterVolumeSpecName: "config-data") pod "c29d634a-779e-4248-b485-a586c702d4b4" (UID: "c29d634a-779e-4248-b485-a586c702d4b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.884370 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.884397 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fw4g\" (UniqueName: \"kubernetes.io/projected/c29d634a-779e-4248-b485-a586c702d4b4-kube-api-access-6fw4g\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:52 crc kubenswrapper[4842]: I0311 19:19:52.884407 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29d634a-779e-4248-b485-a586c702d4b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.229124 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:19:53 crc kubenswrapper[4842]: W0311 19:19:53.236348 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode62ac3a6_d61b_47b0_a91f_0772398f3ddc.slice/crio-a40a0f4ff475c2a5e381e52c5cbb8602064ed87d25ffc8bfdb58b64f5441ee92 WatchSource:0}: Error finding container 
a40a0f4ff475c2a5e381e52c5cbb8602064ed87d25ffc8bfdb58b64f5441ee92: Status 404 returned error can't find the container with id a40a0f4ff475c2a5e381e52c5cbb8602064ed87d25ffc8bfdb58b64f5441ee92 Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.397169 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"e62ac3a6-d61b-47b0-a91f-0772398f3ddc","Type":"ContainerStarted","Data":"a40a0f4ff475c2a5e381e52c5cbb8602064ed87d25ffc8bfdb58b64f5441ee92"} Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.399060 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" event={"ID":"c29d634a-779e-4248-b485-a586c702d4b4","Type":"ContainerDied","Data":"40a8975479615d09734913c70b52250b380c5bdf547dbab78a0ae3952a095ad2"} Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.399098 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40a8975479615d09734913c70b52250b380c5bdf547dbab78a0ae3952a095ad2" Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.399124 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg" Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.593829 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.594504 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-log" containerID="cri-o://717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02" gracePeriod=30 Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.594649 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-api" containerID="cri-o://29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b" gracePeriod=30 Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.620665 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.620862 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="7836e47f-43be-4263-a0e8-0a3e209ce400" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8" gracePeriod=30 Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.660702 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.660900 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-log" 
containerID="cri-o://95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990" gracePeriod=30 Mar 11 19:19:53 crc kubenswrapper[4842]: I0311 19:19:53.661027 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd" gracePeriod=30 Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.154495 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.204794 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b06c-578b-402e-893c-d5bc91514b96-config-data\") pod \"7988b06c-578b-402e-893c-d5bc91514b96\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.204843 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqnkl\" (UniqueName: \"kubernetes.io/projected/7988b06c-578b-402e-893c-d5bc91514b96-kube-api-access-bqnkl\") pod \"7988b06c-578b-402e-893c-d5bc91514b96\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.204885 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b06c-578b-402e-893c-d5bc91514b96-logs\") pod \"7988b06c-578b-402e-893c-d5bc91514b96\" (UID: \"7988b06c-578b-402e-893c-d5bc91514b96\") " Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.205562 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7988b06c-578b-402e-893c-d5bc91514b96-logs" (OuterVolumeSpecName: "logs") pod "7988b06c-578b-402e-893c-d5bc91514b96" 
(UID: "7988b06c-578b-402e-893c-d5bc91514b96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.210421 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7988b06c-578b-402e-893c-d5bc91514b96-kube-api-access-bqnkl" (OuterVolumeSpecName: "kube-api-access-bqnkl") pod "7988b06c-578b-402e-893c-d5bc91514b96" (UID: "7988b06c-578b-402e-893c-d5bc91514b96"). InnerVolumeSpecName "kube-api-access-bqnkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.227875 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7988b06c-578b-402e-893c-d5bc91514b96-config-data" (OuterVolumeSpecName: "config-data") pod "7988b06c-578b-402e-893c-d5bc91514b96" (UID: "7988b06c-578b-402e-893c-d5bc91514b96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.250044 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.306592 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzmqb\" (UniqueName: \"kubernetes.io/projected/61595497-681e-46b6-8625-7488cb61e157-kube-api-access-wzmqb\") pod \"61595497-681e-46b6-8625-7488cb61e157\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.306855 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61595497-681e-46b6-8625-7488cb61e157-logs\") pod \"61595497-681e-46b6-8625-7488cb61e157\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.306926 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61595497-681e-46b6-8625-7488cb61e157-config-data\") pod \"61595497-681e-46b6-8625-7488cb61e157\" (UID: \"61595497-681e-46b6-8625-7488cb61e157\") " Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.307263 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61595497-681e-46b6-8625-7488cb61e157-logs" (OuterVolumeSpecName: "logs") pod "61595497-681e-46b6-8625-7488cb61e157" (UID: "61595497-681e-46b6-8625-7488cb61e157"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.307905 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7988b06c-578b-402e-893c-d5bc91514b96-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.307926 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqnkl\" (UniqueName: \"kubernetes.io/projected/7988b06c-578b-402e-893c-d5bc91514b96-kube-api-access-bqnkl\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.307938 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7988b06c-578b-402e-893c-d5bc91514b96-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.307966 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61595497-681e-46b6-8625-7488cb61e157-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.310320 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61595497-681e-46b6-8625-7488cb61e157-kube-api-access-wzmqb" (OuterVolumeSpecName: "kube-api-access-wzmqb") pod "61595497-681e-46b6-8625-7488cb61e157" (UID: "61595497-681e-46b6-8625-7488cb61e157"). InnerVolumeSpecName "kube-api-access-wzmqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.328599 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61595497-681e-46b6-8625-7488cb61e157-config-data" (OuterVolumeSpecName: "config-data") pod "61595497-681e-46b6-8625-7488cb61e157" (UID: "61595497-681e-46b6-8625-7488cb61e157"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408047 4842 generic.go:334] "Generic (PLEG): container finished" podID="61595497-681e-46b6-8625-7488cb61e157" containerID="954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd" exitCode=0 Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408077 4842 generic.go:334] "Generic (PLEG): container finished" podID="61595497-681e-46b6-8625-7488cb61e157" containerID="95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990" exitCode=143 Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408098 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408139 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"61595497-681e-46b6-8625-7488cb61e157","Type":"ContainerDied","Data":"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd"} Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408200 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"61595497-681e-46b6-8625-7488cb61e157","Type":"ContainerDied","Data":"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990"} Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408212 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"61595497-681e-46b6-8625-7488cb61e157","Type":"ContainerDied","Data":"ab8a4f9ee7d190a545e75e4dfe47ef30daffa3f92d19baf70f60e67cd3950522"} Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408259 4842 scope.go:117] "RemoveContainer" containerID="954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408851 4842 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-wzmqb\" (UniqueName: \"kubernetes.io/projected/61595497-681e-46b6-8625-7488cb61e157-kube-api-access-wzmqb\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.408878 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61595497-681e-46b6-8625-7488cb61e157-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.410539 4842 generic.go:334] "Generic (PLEG): container finished" podID="7988b06c-578b-402e-893c-d5bc91514b96" containerID="29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b" exitCode=0 Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.410569 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.410574 4842 generic.go:334] "Generic (PLEG): container finished" podID="7988b06c-578b-402e-893c-d5bc91514b96" containerID="717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02" exitCode=143 Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.410575 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7988b06c-578b-402e-893c-d5bc91514b96","Type":"ContainerDied","Data":"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b"} Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.410600 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"7988b06c-578b-402e-893c-d5bc91514b96","Type":"ContainerDied","Data":"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02"} Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.410610 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" 
event={"ID":"7988b06c-578b-402e-893c-d5bc91514b96","Type":"ContainerDied","Data":"0dfc9ed8be72f25be48caf2554ee0291ed5a93e39a264dd4e093f05615a48e55"} Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.415971 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"e62ac3a6-d61b-47b0-a91f-0772398f3ddc","Type":"ContainerStarted","Data":"18240a5aba4e417e66b11f8e254e8b82251caf43c2f4b8a02303a313fe3c2828"} Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.416179 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.445292 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=2.445257657 podStartE2EDuration="2.445257657s" podCreationTimestamp="2026-03-11 19:19:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:54.441517677 +0000 UTC m=+1840.089213957" watchObservedRunningTime="2026-03-11 19:19:54.445257657 +0000 UTC m=+1840.092953937" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.448496 4842 scope.go:117] "RemoveContainer" containerID="95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.474544 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.484316 4842 scope.go:117] "RemoveContainer" containerID="954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd" Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.489443 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd\": container with ID starting with 954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd not found: ID does not exist" containerID="954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.489492 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd"} err="failed to get container status \"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd\": rpc error: code = NotFound desc = could not find container \"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd\": container with ID starting with 954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.489519 4842 scope.go:117] "RemoveContainer" containerID="95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.494077 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.498497 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990\": container with ID starting with 95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990 not found: ID does not exist" containerID="95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.498536 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990"} err="failed to get container status 
\"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990\": rpc error: code = NotFound desc = could not find container \"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990\": container with ID starting with 95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990 not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.498563 4842 scope.go:117] "RemoveContainer" containerID="954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.514445 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd"} err="failed to get container status \"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd\": rpc error: code = NotFound desc = could not find container \"954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd\": container with ID starting with 954ac51b85e9c37319f7c0ec20a5902089bd9bf295ac6840ca3b4cc8caef30bd not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.514492 4842 scope.go:117] "RemoveContainer" containerID="95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.518322 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.518789 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990"} err="failed to get container status \"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990\": rpc error: code = NotFound desc = could not find container \"95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990\": container with ID starting with 
95caa767565c9e9520e0d6b478e40bd152f35267c85c6409aed3500de5785990 not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.518835 4842 scope.go:117] "RemoveContainer" containerID="29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.533326 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564311 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.564669 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-api" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564687 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-api" Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.564697 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564706 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.564721 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-log" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564729 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-log" Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.564753 4842 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c29d634a-779e-4248-b485-a586c702d4b4" containerName="nova-manage" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564759 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29d634a-779e-4248-b485-a586c702d4b4" containerName="nova-manage" Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.564771 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-log" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564777 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-log" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564923 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-log" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564935 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-log" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564947 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7988b06c-578b-402e-893c-d5bc91514b96" containerName="nova-kuttl-api-api" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564954 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="61595497-681e-46b6-8625-7488cb61e157" containerName="nova-kuttl-metadata-metadata" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.564962 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29d634a-779e-4248-b485-a586c702d4b4" containerName="nova-manage" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.565751 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.568587 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.602796 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.604183 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.606162 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.614261 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36de4322-50a5-4af7-9eac-8235c9bd9c6b-config-data\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.614399 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36de4322-50a5-4af7-9eac-8235c9bd9c6b-logs\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.614455 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dghtv\" (UniqueName: \"kubernetes.io/projected/36de4322-50a5-4af7-9eac-8235c9bd9c6b-kube-api-access-dghtv\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 
19:19:54.618675 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.626693 4842 scope.go:117] "RemoveContainer" containerID="717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.638844 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.644770 4842 scope.go:117] "RemoveContainer" containerID="29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b" Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.645693 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b\": container with ID starting with 29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b not found: ID does not exist" containerID="29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.645796 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b"} err="failed to get container status \"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b\": rpc error: code = NotFound desc = could not find container \"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b\": container with ID starting with 29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.645878 4842 scope.go:117] "RemoveContainer" containerID="717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02" Mar 11 19:19:54 crc kubenswrapper[4842]: E0311 19:19:54.646284 4842 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02\": container with ID starting with 717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02 not found: ID does not exist" containerID="717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.646315 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02"} err="failed to get container status \"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02\": rpc error: code = NotFound desc = could not find container \"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02\": container with ID starting with 717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02 not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.646338 4842 scope.go:117] "RemoveContainer" containerID="29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.646644 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b"} err="failed to get container status \"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b\": rpc error: code = NotFound desc = could not find container \"29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b\": container with ID starting with 29b3973ebc1ebe88406a9bac172c3e3e24dff461d11afb15454ff229d295104b not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.646683 4842 scope.go:117] "RemoveContainer" containerID="717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.647056 4842 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02"} err="failed to get container status \"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02\": rpc error: code = NotFound desc = could not find container \"717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02\": container with ID starting with 717a28bde9433a1f1667da8a8851f1152848447eff6f2ff1280946020966ee02 not found: ID does not exist" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.715509 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60da412e-bff9-4ad8-838d-3e7c08b814a2-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.715987 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36de4322-50a5-4af7-9eac-8235c9bd9c6b-logs\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.716021 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60da412e-bff9-4ad8-838d-3e7c08b814a2-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.716080 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dghtv\" (UniqueName: \"kubernetes.io/projected/36de4322-50a5-4af7-9eac-8235c9bd9c6b-kube-api-access-dghtv\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" 
Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.716108 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36de4322-50a5-4af7-9eac-8235c9bd9c6b-config-data\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.716147 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfns4\" (UniqueName: \"kubernetes.io/projected/60da412e-bff9-4ad8-838d-3e7c08b814a2-kube-api-access-bfns4\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.716652 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36de4322-50a5-4af7-9eac-8235c9bd9c6b-logs\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.721092 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36de4322-50a5-4af7-9eac-8235c9bd9c6b-config-data\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.734238 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dghtv\" (UniqueName: \"kubernetes.io/projected/36de4322-50a5-4af7-9eac-8235c9bd9c6b-kube-api-access-dghtv\") pod \"nova-kuttl-api-0\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.817499 4842 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bfns4\" (UniqueName: \"kubernetes.io/projected/60da412e-bff9-4ad8-838d-3e7c08b814a2-kube-api-access-bfns4\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.817601 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60da412e-bff9-4ad8-838d-3e7c08b814a2-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.817686 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60da412e-bff9-4ad8-838d-3e7c08b814a2-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.818246 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60da412e-bff9-4ad8-838d-3e7c08b814a2-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.820829 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60da412e-bff9-4ad8-838d-3e7c08b814a2-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.832174 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfns4\" (UniqueName: \"kubernetes.io/projected/60da412e-bff9-4ad8-838d-3e7c08b814a2-kube-api-access-bfns4\") pod 
\"nova-kuttl-metadata-0\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.916455 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.927603 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.985833 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61595497-681e-46b6-8625-7488cb61e157" path="/var/lib/kubelet/pods/61595497-681e-46b6-8625-7488cb61e157/volumes" Mar 11 19:19:54 crc kubenswrapper[4842]: I0311 19:19:54.987190 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7988b06c-578b-402e-893c-d5bc91514b96" path="/var/lib/kubelet/pods/7988b06c-578b-402e-893c-d5bc91514b96/volumes" Mar 11 19:19:55 crc kubenswrapper[4842]: I0311 19:19:55.417392 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:19:55 crc kubenswrapper[4842]: I0311 19:19:55.433937 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"60da412e-bff9-4ad8-838d-3e7c08b814a2","Type":"ContainerStarted","Data":"090d772b18ef06a1b374c1dcbe6b8e1d93a096a15b7e6fec840b6d18ba8545f8"} Mar 11 19:19:55 crc kubenswrapper[4842]: I0311 19:19:55.486603 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:19:55 crc kubenswrapper[4842]: I0311 19:19:55.977195 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:55 crc kubenswrapper[4842]: I0311 19:19:55.994853 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.445677 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"36de4322-50a5-4af7-9eac-8235c9bd9c6b","Type":"ContainerStarted","Data":"102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74"} Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.446051 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"36de4322-50a5-4af7-9eac-8235c9bd9c6b","Type":"ContainerStarted","Data":"7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af"} Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.446076 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"36de4322-50a5-4af7-9eac-8235c9bd9c6b","Type":"ContainerStarted","Data":"268c697e73829f633874a909b318abfcec3a8d83dd7d1a30ca7e2dbc1a5889e6"} Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.448767 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"60da412e-bff9-4ad8-838d-3e7c08b814a2","Type":"ContainerStarted","Data":"a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646"} Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.448801 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"60da412e-bff9-4ad8-838d-3e7c08b814a2","Type":"ContainerStarted","Data":"a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240"} Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.459054 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.464899 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" 
podStartSLOduration=2.464874953 podStartE2EDuration="2.464874953s" podCreationTimestamp="2026-03-11 19:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:56.463822925 +0000 UTC m=+1842.111519245" watchObservedRunningTime="2026-03-11 19:19:56.464874953 +0000 UTC m=+1842.112571283" Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.515396 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.515366106 podStartE2EDuration="2.515366106s" podCreationTimestamp="2026-03-11 19:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:19:56.487651044 +0000 UTC m=+1842.135347344" watchObservedRunningTime="2026-03-11 19:19:56.515366106 +0000 UTC m=+1842.163062396" Mar 11 19:19:56 crc kubenswrapper[4842]: I0311 19:19:56.962831 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3" Mar 11 19:19:56 crc kubenswrapper[4842]: E0311 19:19:56.963355 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.155879 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.273921 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7836e47f-43be-4263-a0e8-0a3e209ce400-config-data\") pod \"7836e47f-43be-4263-a0e8-0a3e209ce400\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.274101 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75k2h\" (UniqueName: \"kubernetes.io/projected/7836e47f-43be-4263-a0e8-0a3e209ce400-kube-api-access-75k2h\") pod \"7836e47f-43be-4263-a0e8-0a3e209ce400\" (UID: \"7836e47f-43be-4263-a0e8-0a3e209ce400\") " Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.279428 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7836e47f-43be-4263-a0e8-0a3e209ce400-kube-api-access-75k2h" (OuterVolumeSpecName: "kube-api-access-75k2h") pod "7836e47f-43be-4263-a0e8-0a3e209ce400" (UID: "7836e47f-43be-4263-a0e8-0a3e209ce400"). InnerVolumeSpecName "kube-api-access-75k2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.299463 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7836e47f-43be-4263-a0e8-0a3e209ce400-config-data" (OuterVolumeSpecName: "config-data") pod "7836e47f-43be-4263-a0e8-0a3e209ce400" (UID: "7836e47f-43be-4263-a0e8-0a3e209ce400"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.375901 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75k2h\" (UniqueName: \"kubernetes.io/projected/7836e47f-43be-4263-a0e8-0a3e209ce400-kube-api-access-75k2h\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.376200 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7836e47f-43be-4263-a0e8-0a3e209ce400-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.469793 4842 generic.go:334] "Generic (PLEG): container finished" podID="7836e47f-43be-4263-a0e8-0a3e209ce400" containerID="bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8" exitCode=0 Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.469831 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7836e47f-43be-4263-a0e8-0a3e209ce400","Type":"ContainerDied","Data":"bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8"} Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.469860 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"7836e47f-43be-4263-a0e8-0a3e209ce400","Type":"ContainerDied","Data":"4e98fbfbe6ecf491aca3df93aec60ee08c7096aa94d844e32658a1442732dfbf"} Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.469874 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.469893 4842 scope.go:117] "RemoveContainer" containerID="bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.488971 4842 scope.go:117] "RemoveContainer" containerID="bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8" Mar 11 19:19:58 crc kubenswrapper[4842]: E0311 19:19:58.489456 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8\": container with ID starting with bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8 not found: ID does not exist" containerID="bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.489510 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8"} err="failed to get container status \"bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8\": rpc error: code = NotFound desc = could not find container \"bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8\": container with ID starting with bb5a93092270b9fd55c37de9b89c0c74c93205c34252c9c538cb51175cf462b8 not found: ID does not exist" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.519698 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.536242 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.549542 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 
19:19:58 crc kubenswrapper[4842]: E0311 19:19:58.549930 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7836e47f-43be-4263-a0e8-0a3e209ce400" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.549944 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7836e47f-43be-4263-a0e8-0a3e209ce400" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.550210 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7836e47f-43be-4263-a0e8-0a3e209ce400" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.550861 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.554487 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.575329 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.579388 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2171d9-c161-440f-9401-65787d654d6c-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.579442 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twzh4\" (UniqueName: \"kubernetes.io/projected/4d2171d9-c161-440f-9401-65787d654d6c-kube-api-access-twzh4\") pod \"nova-kuttl-scheduler-0\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 
19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.681511 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2171d9-c161-440f-9401-65787d654d6c-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.681564 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twzh4\" (UniqueName: \"kubernetes.io/projected/4d2171d9-c161-440f-9401-65787d654d6c-kube-api-access-twzh4\") pod \"nova-kuttl-scheduler-0\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.685904 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2171d9-c161-440f-9401-65787d654d6c-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.707605 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twzh4\" (UniqueName: \"kubernetes.io/projected/4d2171d9-c161-440f-9401-65787d654d6c-kube-api-access-twzh4\") pod \"nova-kuttl-scheduler-0\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.903596 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:19:58 crc kubenswrapper[4842]: I0311 19:19:58.978026 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7836e47f-43be-4263-a0e8-0a3e209ce400" path="/var/lib/kubelet/pods/7836e47f-43be-4263-a0e8-0a3e209ce400/volumes" Mar 11 19:19:59 crc kubenswrapper[4842]: I0311 19:19:59.369394 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:19:59 crc kubenswrapper[4842]: W0311 19:19:59.377430 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d2171d9_c161_440f_9401_65787d654d6c.slice/crio-fc80b7f8c35863cc9e1f5a20eae544d32cf4aeec559eba9107d530ce4ed5bfae WatchSource:0}: Error finding container fc80b7f8c35863cc9e1f5a20eae544d32cf4aeec559eba9107d530ce4ed5bfae: Status 404 returned error can't find the container with id fc80b7f8c35863cc9e1f5a20eae544d32cf4aeec559eba9107d530ce4ed5bfae Mar 11 19:19:59 crc kubenswrapper[4842]: I0311 19:19:59.524550 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"4d2171d9-c161-440f-9401-65787d654d6c","Type":"ContainerStarted","Data":"fc80b7f8c35863cc9e1f5a20eae544d32cf4aeec559eba9107d530ce4ed5bfae"} Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.130121 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554280-75pf5"] Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.137448 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554280-75pf5" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.139640 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554280-75pf5"] Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.141209 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.141342 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.141419 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.209479 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2cz7\" (UniqueName: \"kubernetes.io/projected/a5d5da09-a866-4784-9c0d-914384019453-kube-api-access-f2cz7\") pod \"auto-csr-approver-29554280-75pf5\" (UID: \"a5d5da09-a866-4784-9c0d-914384019453\") " pod="openshift-infra/auto-csr-approver-29554280-75pf5" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.311493 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2cz7\" (UniqueName: \"kubernetes.io/projected/a5d5da09-a866-4784-9c0d-914384019453-kube-api-access-f2cz7\") pod \"auto-csr-approver-29554280-75pf5\" (UID: \"a5d5da09-a866-4784-9c0d-914384019453\") " pod="openshift-infra/auto-csr-approver-29554280-75pf5" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.341079 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2cz7\" (UniqueName: \"kubernetes.io/projected/a5d5da09-a866-4784-9c0d-914384019453-kube-api-access-f2cz7\") pod \"auto-csr-approver-29554280-75pf5\" (UID: \"a5d5da09-a866-4784-9c0d-914384019453\") " 
pod="openshift-infra/auto-csr-approver-29554280-75pf5" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.455252 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554280-75pf5" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.542511 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"4d2171d9-c161-440f-9401-65787d654d6c","Type":"ContainerStarted","Data":"0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1"} Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.569943 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.569925269 podStartE2EDuration="2.569925269s" podCreationTimestamp="2026-03-11 19:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:00.562695605 +0000 UTC m=+1846.210391905" watchObservedRunningTime="2026-03-11 19:20:00.569925269 +0000 UTC m=+1846.217621559" Mar 11 19:20:00 crc kubenswrapper[4842]: I0311 19:20:00.775861 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554280-75pf5"] Mar 11 19:20:01 crc kubenswrapper[4842]: I0311 19:20:01.557538 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554280-75pf5" event={"ID":"a5d5da09-a866-4784-9c0d-914384019453","Type":"ContainerStarted","Data":"1a404b18ce2d17a39643ed868466d01d9ad844a52005a21fe5387fd77e6e219a"} Mar 11 19:20:02 crc kubenswrapper[4842]: I0311 19:20:02.567071 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554280-75pf5" event={"ID":"a5d5da09-a866-4784-9c0d-914384019453","Type":"ContainerStarted","Data":"e87336e6f4fc37d8d521a5d41a2980d13d6cc485191609f08bf992630e6a7c76"} Mar 11 19:20:02 crc 
kubenswrapper[4842]: I0311 19:20:02.583608 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29554280-75pf5" podStartSLOduration=1.169778042 podStartE2EDuration="2.583590885s" podCreationTimestamp="2026-03-11 19:20:00 +0000 UTC" firstStartedPulling="2026-03-11 19:20:00.770512533 +0000 UTC m=+1846.418208813" lastFinishedPulling="2026-03-11 19:20:02.184325376 +0000 UTC m=+1847.832021656" observedRunningTime="2026-03-11 19:20:02.577330157 +0000 UTC m=+1848.225026437" watchObservedRunningTime="2026-03-11 19:20:02.583590885 +0000 UTC m=+1848.231287165" Mar 11 19:20:02 crc kubenswrapper[4842]: I0311 19:20:02.818540 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.272176 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"] Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.273842 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.276185 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-scripts" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.276444 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-config-data" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.290087 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"] Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.362210 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-scripts\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.362289 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8knh6\" (UniqueName: \"kubernetes.io/projected/401ac87c-7d66-4407-8a63-cb5f3012e732-kube-api-access-8knh6\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.362322 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-config-data\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.463942 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-scripts\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.464029 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8knh6\" (UniqueName: \"kubernetes.io/projected/401ac87c-7d66-4407-8a63-cb5f3012e732-kube-api-access-8knh6\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.464073 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-config-data\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.469671 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-scripts\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.480786 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8knh6\" (UniqueName: \"kubernetes.io/projected/401ac87c-7d66-4407-8a63-cb5f3012e732-kube-api-access-8knh6\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" Mar 11 19:20:03 crc 
kubenswrapper[4842]: I0311 19:20:03.482837 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-config-data\") pod \"nova-kuttl-cell1-cell-mapping-xp9wv\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"
Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.578460 4842 generic.go:334] "Generic (PLEG): container finished" podID="a5d5da09-a866-4784-9c0d-914384019453" containerID="e87336e6f4fc37d8d521a5d41a2980d13d6cc485191609f08bf992630e6a7c76" exitCode=0
Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.578506 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554280-75pf5" event={"ID":"a5d5da09-a866-4784-9c0d-914384019453","Type":"ContainerDied","Data":"e87336e6f4fc37d8d521a5d41a2980d13d6cc485191609f08bf992630e6a7c76"}
Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.605577 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"
Mar 11 19:20:03 crc kubenswrapper[4842]: I0311 19:20:03.915301 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.238793 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"]
Mar 11 19:20:04 crc kubenswrapper[4842]: W0311 19:20:04.240032 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod401ac87c_7d66_4407_8a63_cb5f3012e732.slice/crio-a503b750926e18dbc524a3093f67c801b4183b5d94346d6a17160d9434d4dcda WatchSource:0}: Error finding container a503b750926e18dbc524a3093f67c801b4183b5d94346d6a17160d9434d4dcda: Status 404 returned error can't find the container with id a503b750926e18dbc524a3093f67c801b4183b5d94346d6a17160d9434d4dcda
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.588365 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" event={"ID":"401ac87c-7d66-4407-8a63-cb5f3012e732","Type":"ContainerStarted","Data":"59310e84aad42a1f15c234940050ad69e56ab79b8684f43851201e9625bca193"}
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.588796 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" event={"ID":"401ac87c-7d66-4407-8a63-cb5f3012e732","Type":"ContainerStarted","Data":"a503b750926e18dbc524a3093f67c801b4183b5d94346d6a17160d9434d4dcda"}
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.608964 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" podStartSLOduration=1.608947865 podStartE2EDuration="1.608947865s" podCreationTimestamp="2026-03-11 19:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:04.607163517 +0000 UTC m=+1850.254859807" watchObservedRunningTime="2026-03-11 19:20:04.608947865 +0000 UTC m=+1850.256644145"
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.914766 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554280-75pf5"
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.917425 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.917493 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.929529 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:04 crc kubenswrapper[4842]: I0311 19:20:04.932678 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.035197 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2cz7\" (UniqueName: \"kubernetes.io/projected/a5d5da09-a866-4784-9c0d-914384019453-kube-api-access-f2cz7\") pod \"a5d5da09-a866-4784-9c0d-914384019453\" (UID: \"a5d5da09-a866-4784-9c0d-914384019453\") "
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.047973 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d5da09-a866-4784-9c0d-914384019453-kube-api-access-f2cz7" (OuterVolumeSpecName: "kube-api-access-f2cz7") pod "a5d5da09-a866-4784-9c0d-914384019453" (UID: "a5d5da09-a866-4784-9c0d-914384019453"). InnerVolumeSpecName "kube-api-access-f2cz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.137582 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2cz7\" (UniqueName: \"kubernetes.io/projected/a5d5da09-a866-4784-9c0d-914384019453-kube-api-access-f2cz7\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.609974 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554280-75pf5" event={"ID":"a5d5da09-a866-4784-9c0d-914384019453","Type":"ContainerDied","Data":"1a404b18ce2d17a39643ed868466d01d9ad844a52005a21fe5387fd77e6e219a"}
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.610100 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a404b18ce2d17a39643ed868466d01d9ad844a52005a21fe5387fd77e6e219a"
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.610386 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554280-75pf5"
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.663232 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554274-stqtr"]
Mar 11 19:20:05 crc kubenswrapper[4842]: I0311 19:20:05.672889 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554274-stqtr"]
Mar 11 19:20:06 crc kubenswrapper[4842]: I0311 19:20:06.083501 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:20:06 crc kubenswrapper[4842]: I0311 19:20:06.083840 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.248:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:20:06 crc kubenswrapper[4842]: I0311 19:20:06.084431 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:20:06 crc kubenswrapper[4842]: I0311 19:20:06.084478 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.248:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:20:06 crc kubenswrapper[4842]: I0311 19:20:06.970470 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a80557-5c7f-4f12-a2a1-60c9d6555fa9" path="/var/lib/kubelet/pods/c7a80557-5c7f-4f12-a2a1-60c9d6555fa9/volumes"
Mar 11 19:20:08 crc kubenswrapper[4842]: I0311 19:20:08.904452 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:20:08 crc kubenswrapper[4842]: I0311 19:20:08.928100 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:20:09 crc kubenswrapper[4842]: I0311 19:20:09.642322 4842 generic.go:334] "Generic (PLEG): container finished" podID="401ac87c-7d66-4407-8a63-cb5f3012e732" containerID="59310e84aad42a1f15c234940050ad69e56ab79b8684f43851201e9625bca193" exitCode=0
Mar 11 19:20:09 crc kubenswrapper[4842]: I0311 19:20:09.642425 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" event={"ID":"401ac87c-7d66-4407-8a63-cb5f3012e732","Type":"ContainerDied","Data":"59310e84aad42a1f15c234940050ad69e56ab79b8684f43851201e9625bca193"}
Mar 11 19:20:09 crc kubenswrapper[4842]: I0311 19:20:09.673560 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:20:10 crc kubenswrapper[4842]: I0311 19:20:10.967818 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.108248 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.159350 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-config-data\") pod \"401ac87c-7d66-4407-8a63-cb5f3012e732\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") "
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.159397 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-scripts\") pod \"401ac87c-7d66-4407-8a63-cb5f3012e732\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") "
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.159429 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8knh6\" (UniqueName: \"kubernetes.io/projected/401ac87c-7d66-4407-8a63-cb5f3012e732-kube-api-access-8knh6\") pod \"401ac87c-7d66-4407-8a63-cb5f3012e732\" (UID: \"401ac87c-7d66-4407-8a63-cb5f3012e732\") "
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.165420 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-scripts" (OuterVolumeSpecName: "scripts") pod "401ac87c-7d66-4407-8a63-cb5f3012e732" (UID: "401ac87c-7d66-4407-8a63-cb5f3012e732"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.165601 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401ac87c-7d66-4407-8a63-cb5f3012e732-kube-api-access-8knh6" (OuterVolumeSpecName: "kube-api-access-8knh6") pod "401ac87c-7d66-4407-8a63-cb5f3012e732" (UID: "401ac87c-7d66-4407-8a63-cb5f3012e732"). InnerVolumeSpecName "kube-api-access-8knh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.201028 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-config-data" (OuterVolumeSpecName: "config-data") pod "401ac87c-7d66-4407-8a63-cb5f3012e732" (UID: "401ac87c-7d66-4407-8a63-cb5f3012e732"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.261946 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.262361 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/401ac87c-7d66-4407-8a63-cb5f3012e732-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.262373 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8knh6\" (UniqueName: \"kubernetes.io/projected/401ac87c-7d66-4407-8a63-cb5f3012e732-kube-api-access-8knh6\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.665370 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv" event={"ID":"401ac87c-7d66-4407-8a63-cb5f3012e732","Type":"ContainerDied","Data":"a503b750926e18dbc524a3093f67c801b4183b5d94346d6a17160d9434d4dcda"}
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.665412 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a503b750926e18dbc524a3093f67c801b4183b5d94346d6a17160d9434d4dcda"
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.665469 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.670989 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"aa35ccc7978f645aca41cd60ffb442586f1c1d0afa2c03aac66ec6981fadd10b"}
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.853207 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.853531 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-log" containerID="cri-o://7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af" gracePeriod=30
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.853682 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-api" containerID="cri-o://102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74" gracePeriod=30
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.890218 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.890423 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="4d2171d9-c161-440f-9401-65787d654d6c" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" gracePeriod=30
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.942181 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.942732 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-log" containerID="cri-o://a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240" gracePeriod=30
Mar 11 19:20:11 crc kubenswrapper[4842]: I0311 19:20:11.942776 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646" gracePeriod=30
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.683130 4842 generic.go:334] "Generic (PLEG): container finished" podID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerID="a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240" exitCode=143
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.683212 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"60da412e-bff9-4ad8-838d-3e7c08b814a2","Type":"ContainerDied","Data":"a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240"}
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.685587 4842 generic.go:334] "Generic (PLEG): container finished" podID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerID="7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af" exitCode=143
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.685608 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"36de4322-50a5-4af7-9eac-8235c9bd9c6b","Type":"ContainerDied","Data":"7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af"}
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.917869 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.917983 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.929131 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:12 crc kubenswrapper[4842]: I0311 19:20:12.929249 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:13 crc kubenswrapper[4842]: E0311 19:20:13.907188 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 19:20:13 crc kubenswrapper[4842]: E0311 19:20:13.909458 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 19:20:13 crc kubenswrapper[4842]: E0311 19:20:13.912097 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 19:20:13 crc kubenswrapper[4842]: E0311 19:20:13.912155 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="4d2171d9-c161-440f-9401-65787d654d6c" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.499310 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.504622 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.634566 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36de4322-50a5-4af7-9eac-8235c9bd9c6b-logs\") pod \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") "
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.634635 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36de4322-50a5-4af7-9eac-8235c9bd9c6b-config-data\") pod \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") "
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.634680 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60da412e-bff9-4ad8-838d-3e7c08b814a2-config-data\") pod \"60da412e-bff9-4ad8-838d-3e7c08b814a2\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") "
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.634740 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfns4\" (UniqueName: \"kubernetes.io/projected/60da412e-bff9-4ad8-838d-3e7c08b814a2-kube-api-access-bfns4\") pod \"60da412e-bff9-4ad8-838d-3e7c08b814a2\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") "
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.634776 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60da412e-bff9-4ad8-838d-3e7c08b814a2-logs\") pod \"60da412e-bff9-4ad8-838d-3e7c08b814a2\" (UID: \"60da412e-bff9-4ad8-838d-3e7c08b814a2\") "
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.634896 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dghtv\" (UniqueName: \"kubernetes.io/projected/36de4322-50a5-4af7-9eac-8235c9bd9c6b-kube-api-access-dghtv\") pod \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\" (UID: \"36de4322-50a5-4af7-9eac-8235c9bd9c6b\") "
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.635381 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36de4322-50a5-4af7-9eac-8235c9bd9c6b-logs" (OuterVolumeSpecName: "logs") pod "36de4322-50a5-4af7-9eac-8235c9bd9c6b" (UID: "36de4322-50a5-4af7-9eac-8235c9bd9c6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.635873 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60da412e-bff9-4ad8-838d-3e7c08b814a2-logs" (OuterVolumeSpecName: "logs") pod "60da412e-bff9-4ad8-838d-3e7c08b814a2" (UID: "60da412e-bff9-4ad8-838d-3e7c08b814a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.641463 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36de4322-50a5-4af7-9eac-8235c9bd9c6b-kube-api-access-dghtv" (OuterVolumeSpecName: "kube-api-access-dghtv") pod "36de4322-50a5-4af7-9eac-8235c9bd9c6b" (UID: "36de4322-50a5-4af7-9eac-8235c9bd9c6b"). InnerVolumeSpecName "kube-api-access-dghtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.643049 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60da412e-bff9-4ad8-838d-3e7c08b814a2-kube-api-access-bfns4" (OuterVolumeSpecName: "kube-api-access-bfns4") pod "60da412e-bff9-4ad8-838d-3e7c08b814a2" (UID: "60da412e-bff9-4ad8-838d-3e7c08b814a2"). InnerVolumeSpecName "kube-api-access-bfns4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.661174 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36de4322-50a5-4af7-9eac-8235c9bd9c6b-config-data" (OuterVolumeSpecName: "config-data") pod "36de4322-50a5-4af7-9eac-8235c9bd9c6b" (UID: "36de4322-50a5-4af7-9eac-8235c9bd9c6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.662941 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60da412e-bff9-4ad8-838d-3e7c08b814a2-config-data" (OuterVolumeSpecName: "config-data") pod "60da412e-bff9-4ad8-838d-3e7c08b814a2" (UID: "60da412e-bff9-4ad8-838d-3e7c08b814a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.737033 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60da412e-bff9-4ad8-838d-3e7c08b814a2-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.737056 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfns4\" (UniqueName: \"kubernetes.io/projected/60da412e-bff9-4ad8-838d-3e7c08b814a2-kube-api-access-bfns4\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.737066 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60da412e-bff9-4ad8-838d-3e7c08b814a2-logs\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.737101 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dghtv\" (UniqueName: \"kubernetes.io/projected/36de4322-50a5-4af7-9eac-8235c9bd9c6b-kube-api-access-dghtv\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.737110 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36de4322-50a5-4af7-9eac-8235c9bd9c6b-logs\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.737119 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36de4322-50a5-4af7-9eac-8235c9bd9c6b-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.740893 4842 generic.go:334] "Generic (PLEG): container finished" podID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerID="a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646" exitCode=0
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.740969 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"60da412e-bff9-4ad8-838d-3e7c08b814a2","Type":"ContainerDied","Data":"a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646"}
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.740980 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.741001 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"60da412e-bff9-4ad8-838d-3e7c08b814a2","Type":"ContainerDied","Data":"090d772b18ef06a1b374c1dcbe6b8e1d93a096a15b7e6fec840b6d18ba8545f8"}
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.741029 4842 scope.go:117] "RemoveContainer" containerID="a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.746534 4842 generic.go:334] "Generic (PLEG): container finished" podID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerID="102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74" exitCode=0
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.746692 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"36de4322-50a5-4af7-9eac-8235c9bd9c6b","Type":"ContainerDied","Data":"102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74"}
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.746742 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"36de4322-50a5-4af7-9eac-8235c9bd9c6b","Type":"ContainerDied","Data":"268c697e73829f633874a909b318abfcec3a8d83dd7d1a30ca7e2dbc1a5889e6"}
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.746749 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.768729 4842 scope.go:117] "RemoveContainer" containerID="a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.804509 4842 scope.go:117] "RemoveContainer" containerID="a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.804604 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.805088 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646\": container with ID starting with a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646 not found: ID does not exist" containerID="a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.805120 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646"} err="failed to get container status \"a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646\": rpc error: code = NotFound desc = could not find container \"a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646\": container with ID starting with a09bedb6338a0f6a00adbe54a9c56e6ccb54f525b01ba42dbaff2cfaf9fdc646 not found: ID does not exist"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.805140 4842 scope.go:117] "RemoveContainer" containerID="a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240"
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.805369 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240\": container with ID starting with a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240 not found: ID does not exist" containerID="a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.805385 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240"} err="failed to get container status \"a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240\": rpc error: code = NotFound desc = could not find container \"a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240\": container with ID starting with a80ed3bda2f63f48218ff35d6e718ca38d595951cfa27507caf323a746838240 not found: ID does not exist"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.805395 4842 scope.go:117] "RemoveContainer" containerID="102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.821353 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.838789 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.839137 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d5da09-a866-4784-9c0d-914384019453" containerName="oc"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839149 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d5da09-a866-4784-9c0d-914384019453" containerName="oc"
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.839163 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401ac87c-7d66-4407-8a63-cb5f3012e732" containerName="nova-manage"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839169 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="401ac87c-7d66-4407-8a63-cb5f3012e732" containerName="nova-manage"
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.839185 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-log"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839191 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-log"
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.839208 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839214 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.839233 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-log"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839239 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-log"
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.839249 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-api"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839255 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-api"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839410 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-log"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839422 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-log"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839431 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839443 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="401ac87c-7d66-4407-8a63-cb5f3012e732" containerName="nova-manage"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839453 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" containerName="nova-kuttl-api-api"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.839462 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d5da09-a866-4784-9c0d-914384019453" containerName="oc"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.843321 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.850941 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.858026 4842 scope.go:117] "RemoveContainer" containerID="7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.866533 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.878003 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.885939 4842 scope.go:117] "RemoveContainer" containerID="102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74"
Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.886146 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74\": container with ID starting with 102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74 not found: ID does not exist" containerID="102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74"
Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.886168 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74"} err="failed to get container status \"102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74\": rpc error: code = NotFound desc = could not find container \"102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74\": container with ID starting with 102176cc29c8ae8d8a9c5bf9bcccc00c8bfecf86de1f2b6149567df4c46ebd74 not found: ID does not exist"
Mar
11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.886188 4842 scope.go:117] "RemoveContainer" containerID="7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af" Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:15.886341 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af\": container with ID starting with 7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af not found: ID does not exist" containerID="7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.886356 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af"} err="failed to get container status \"7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af\": rpc error: code = NotFound desc = could not find container \"7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af\": container with ID starting with 7e80b0b9a06a701fd4e2f308006da62e80f40282b147543da353752b4706b6af not found: ID does not exist" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.892154 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.893416 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.897313 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.897686 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.903243 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.939020 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.939095 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:15.939139 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hdq\" (UniqueName: \"kubernetes.io/projected/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-kube-api-access-h4hdq\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.040671 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.041020 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25f746b-58f8-48ec-8f73-f367a6148143-config-data\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.041044 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.041068 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d25f746b-58f8-48ec-8f73-f367a6148143-logs\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.041096 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xshk\" (UniqueName: \"kubernetes.io/projected/d25f746b-58f8-48ec-8f73-f367a6148143-kube-api-access-2xshk\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.041177 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hdq\" (UniqueName: 
\"kubernetes.io/projected/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-kube-api-access-h4hdq\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.041290 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.047737 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.058044 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hdq\" (UniqueName: \"kubernetes.io/projected/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-kube-api-access-h4hdq\") pod \"nova-kuttl-metadata-0\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.143035 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d25f746b-58f8-48ec-8f73-f367a6148143-logs\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.143532 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xshk\" (UniqueName: \"kubernetes.io/projected/d25f746b-58f8-48ec-8f73-f367a6148143-kube-api-access-2xshk\") pod \"nova-kuttl-api-0\" (UID: 
\"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.143648 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25f746b-58f8-48ec-8f73-f367a6148143-config-data\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.144202 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d25f746b-58f8-48ec-8f73-f367a6148143-logs\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.148684 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25f746b-58f8-48ec-8f73-f367a6148143-config-data\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.168817 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xshk\" (UniqueName: \"kubernetes.io/projected/d25f746b-58f8-48ec-8f73-f367a6148143-kube-api-access-2xshk\") pod \"nova-kuttl-api-0\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.181205 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.217065 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.424082 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.552879 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twzh4\" (UniqueName: \"kubernetes.io/projected/4d2171d9-c161-440f-9401-65787d654d6c-kube-api-access-twzh4\") pod \"4d2171d9-c161-440f-9401-65787d654d6c\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.552949 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2171d9-c161-440f-9401-65787d654d6c-config-data\") pod \"4d2171d9-c161-440f-9401-65787d654d6c\" (UID: \"4d2171d9-c161-440f-9401-65787d654d6c\") " Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.559470 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2171d9-c161-440f-9401-65787d654d6c-kube-api-access-twzh4" (OuterVolumeSpecName: "kube-api-access-twzh4") pod "4d2171d9-c161-440f-9401-65787d654d6c" (UID: "4d2171d9-c161-440f-9401-65787d654d6c"). InnerVolumeSpecName "kube-api-access-twzh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.582483 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d2171d9-c161-440f-9401-65787d654d6c-config-data" (OuterVolumeSpecName: "config-data") pod "4d2171d9-c161-440f-9401-65787d654d6c" (UID: "4d2171d9-c161-440f-9401-65787d654d6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.655627 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twzh4\" (UniqueName: \"kubernetes.io/projected/4d2171d9-c161-440f-9401-65787d654d6c-kube-api-access-twzh4\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.655658 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2171d9-c161-440f-9401-65787d654d6c-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.676917 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: W0311 19:20:16.683913 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c7aaf72_23a2_4399_af3c_2997d9a88e0e.slice/crio-576d6b1b313ca26ae4fc90296d46f54a54333ca72c161a96a6b39a15c7c6a071 WatchSource:0}: Error finding container 576d6b1b313ca26ae4fc90296d46f54a54333ca72c161a96a6b39a15c7c6a071: Status 404 returned error can't find the container with id 576d6b1b313ca26ae4fc90296d46f54a54333ca72c161a96a6b39a15c7c6a071 Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.761958 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.767547 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8c7aaf72-23a2-4399-af3c-2997d9a88e0e","Type":"ContainerStarted","Data":"576d6b1b313ca26ae4fc90296d46f54a54333ca72c161a96a6b39a15c7c6a071"} Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.769723 4842 generic.go:334] "Generic (PLEG): container finished" podID="4d2171d9-c161-440f-9401-65787d654d6c" 
containerID="0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" exitCode=0 Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.769799 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.769812 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"4d2171d9-c161-440f-9401-65787d654d6c","Type":"ContainerDied","Data":"0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1"} Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.770764 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"4d2171d9-c161-440f-9401-65787d654d6c","Type":"ContainerDied","Data":"fc80b7f8c35863cc9e1f5a20eae544d32cf4aeec559eba9107d530ce4ed5bfae"} Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.770794 4842 scope.go:117] "RemoveContainer" containerID="0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" Mar 11 19:20:16 crc kubenswrapper[4842]: W0311 19:20:16.782213 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd25f746b_58f8_48ec_8f73_f367a6148143.slice/crio-8958b884638366e3729a849485d6786b8c16548ec71f9c9cf16e3b0350967b27 WatchSource:0}: Error finding container 8958b884638366e3729a849485d6786b8c16548ec71f9c9cf16e3b0350967b27: Status 404 returned error can't find the container with id 8958b884638366e3729a849485d6786b8c16548ec71f9c9cf16e3b0350967b27 Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.828701 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.835215 4842 scope.go:117] "RemoveContainer" containerID="0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" Mar 11 19:20:16 crc 
kubenswrapper[4842]: E0311 19:20:16.836493 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1\": container with ID starting with 0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1 not found: ID does not exist" containerID="0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.836545 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1"} err="failed to get container status \"0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1\": rpc error: code = NotFound desc = could not find container \"0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1\": container with ID starting with 0778218c7d524c347262bbbe9b6e2d38e2ea4e7730c5dfd2c81ee99c3d8d09e1 not found: ID does not exist" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.839384 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.857350 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: E0311 19:20:16.857666 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2171d9-c161-440f-9401-65787d654d6c" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.857680 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2171d9-c161-440f-9401-65787d654d6c" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.857831 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2171d9-c161-440f-9401-65787d654d6c" 
containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.858379 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.858454 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.863130 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.961032 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434157e1-507f-4acd-b1ad-b8c8add9d863-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.961087 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbcgw\" (UniqueName: \"kubernetes.io/projected/434157e1-507f-4acd-b1ad-b8c8add9d863-kube-api-access-wbcgw\") pod \"nova-kuttl-scheduler-0\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.971757 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36de4322-50a5-4af7-9eac-8235c9bd9c6b" path="/var/lib/kubelet/pods/36de4322-50a5-4af7-9eac-8235c9bd9c6b/volumes" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.972453 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2171d9-c161-440f-9401-65787d654d6c" path="/var/lib/kubelet/pods/4d2171d9-c161-440f-9401-65787d654d6c/volumes" Mar 11 19:20:16 crc kubenswrapper[4842]: I0311 19:20:16.973439 4842 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="60da412e-bff9-4ad8-838d-3e7c08b814a2" path="/var/lib/kubelet/pods/60da412e-bff9-4ad8-838d-3e7c08b814a2/volumes" Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.063153 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434157e1-507f-4acd-b1ad-b8c8add9d863-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.064451 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbcgw\" (UniqueName: \"kubernetes.io/projected/434157e1-507f-4acd-b1ad-b8c8add9d863-kube-api-access-wbcgw\") pod \"nova-kuttl-scheduler-0\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.069196 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434157e1-507f-4acd-b1ad-b8c8add9d863-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.106799 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbcgw\" (UniqueName: \"kubernetes.io/projected/434157e1-507f-4acd-b1ad-b8c8add9d863-kube-api-access-wbcgw\") pod \"nova-kuttl-scheduler-0\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.196620 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.652183 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.782900 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8c7aaf72-23a2-4399-af3c-2997d9a88e0e","Type":"ContainerStarted","Data":"9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215"} Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.782972 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8c7aaf72-23a2-4399-af3c-2997d9a88e0e","Type":"ContainerStarted","Data":"daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea"} Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.786991 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"434157e1-507f-4acd-b1ad-b8c8add9d863","Type":"ContainerStarted","Data":"6ad7bd17a843742cc0c53bd876a54e895d6e206ca379221913e565aa8c2fa860"} Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.790116 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"d25f746b-58f8-48ec-8f73-f367a6148143","Type":"ContainerStarted","Data":"bb3cf014b033909853f63dc9436857d7962c07b4e9e816dea85324527c2bd4f5"} Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.790149 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"d25f746b-58f8-48ec-8f73-f367a6148143","Type":"ContainerStarted","Data":"c52d1549df7ed945b169cd516fa77847b9096f38e9cc5fa49b918dbee880d998"} Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.790164 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" 
event={"ID":"d25f746b-58f8-48ec-8f73-f367a6148143","Type":"ContainerStarted","Data":"8958b884638366e3729a849485d6786b8c16548ec71f9c9cf16e3b0350967b27"} Mar 11 19:20:17 crc kubenswrapper[4842]: I0311 19:20:17.815827 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.8158057039999997 podStartE2EDuration="2.815805704s" podCreationTimestamp="2026-03-11 19:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:17.801800629 +0000 UTC m=+1863.449496949" watchObservedRunningTime="2026-03-11 19:20:17.815805704 +0000 UTC m=+1863.463501994" Mar 11 19:20:18 crc kubenswrapper[4842]: I0311 19:20:18.801339 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"434157e1-507f-4acd-b1ad-b8c8add9d863","Type":"ContainerStarted","Data":"65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818"} Mar 11 19:20:18 crc kubenswrapper[4842]: I0311 19:20:18.832150 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=3.832116097 podStartE2EDuration="3.832116097s" podCreationTimestamp="2026-03-11 19:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:17.828677009 +0000 UTC m=+1863.476373299" watchObservedRunningTime="2026-03-11 19:20:18.832116097 +0000 UTC m=+1864.479812407" Mar 11 19:20:18 crc kubenswrapper[4842]: I0311 19:20:18.836174 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.836159655 podStartE2EDuration="2.836159655s" podCreationTimestamp="2026-03-11 19:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:18.828740736 +0000 UTC m=+1864.476437066" watchObservedRunningTime="2026-03-11 19:20:18.836159655 +0000 UTC m=+1864.483855965" Mar 11 19:20:22 crc kubenswrapper[4842]: I0311 19:20:22.196864 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:26 crc kubenswrapper[4842]: I0311 19:20:26.181341 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:26 crc kubenswrapper[4842]: I0311 19:20:26.181783 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:26 crc kubenswrapper[4842]: I0311 19:20:26.217745 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:26 crc kubenswrapper[4842]: I0311 19:20:26.217818 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:27 crc kubenswrapper[4842]: I0311 19:20:27.197749 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:27 crc kubenswrapper[4842]: I0311 19:20:27.230060 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:20:27 crc kubenswrapper[4842]: I0311 19:20:27.346837 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.0.252:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:20:27 crc kubenswrapper[4842]: I0311 19:20:27.346895 4842 prober.go:107] "Probe failed" 
probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.252:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:20:27 crc kubenswrapper[4842]: I0311 19:20:27.346999 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:20:27 crc kubenswrapper[4842]: I0311 19:20:27.347560 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 11 19:20:27 crc kubenswrapper[4842]: I0311 19:20:27.919958 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:20:29 crc kubenswrapper[4842]: I0311 19:20:29.855294 4842 scope.go:117] "RemoveContainer" containerID="7ad6f2f52233b63417b42ea2e13ffc051490e74f2473fa21d11a8706cb2cb3a6"
Mar 11 19:20:29 crc kubenswrapper[4842]: I0311 19:20:29.880429 4842 scope.go:117] "RemoveContainer" containerID="595f1f96e5649954853bf82aca7f1546a636c69c45cca8e454c1919142527ee8"
Mar 11 19:20:29 crc kubenswrapper[4842]: I0311 19:20:29.957083 4842 scope.go:117] "RemoveContainer" containerID="dd94d5f5babba55d95fa252012f301fd612f1f3e22e6c4287831053ffc293669"
Mar 11 19:20:30 crc kubenswrapper[4842]: I0311 19:20:30.009166 4842 scope.go:117] "RemoveContainer" containerID="de126fc9d74f3844e49ae78aad19ca0ca51e5d2f1483c30709e045c2c3f82936"
Mar 11 19:20:30 crc kubenswrapper[4842]: I0311 19:20:30.055091 4842 scope.go:117] "RemoveContainer" containerID="17185d003426a59e826a7a36a128aa44c461dfa1f1d4a03d090fbd48ddb2e352"
Mar 11 19:20:30 crc kubenswrapper[4842]: I0311 19:20:30.078233 4842 scope.go:117] "RemoveContainer" containerID="5973b07023f76418605b1d1a497ef3a022a1ae8c731d75c6374fd0cfefc88d8b"
Mar 11 19:20:34 crc kubenswrapper[4842]: I0311 19:20:34.182014 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:34 crc kubenswrapper[4842]: I0311 19:20:34.182415 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:34 crc kubenswrapper[4842]: I0311 19:20:34.219069 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:34 crc kubenswrapper[4842]: I0311 19:20:34.219170 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:36 crc kubenswrapper[4842]: I0311 19:20:36.186739 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:36 crc kubenswrapper[4842]: I0311 19:20:36.187640 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:36 crc kubenswrapper[4842]: I0311 19:20:36.190945 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:36 crc kubenswrapper[4842]: I0311 19:20:36.227469 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:36 crc kubenswrapper[4842]: I0311 19:20:36.230232 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:36 crc kubenswrapper[4842]: I0311 19:20:36.232044 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:37 crc kubenswrapper[4842]: I0311 19:20:37.001371 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0"
Mar 11 19:20:37 crc kubenswrapper[4842]: I0311 19:20:37.005977 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.880452 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"]
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.886178 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-xp9wv"]
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.900511 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg"]
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.909187 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-jcpbg"]
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.939796 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novaapib1de-account-delete-fqcgn"]
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.940768 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.960506 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapib1de-account-delete-fqcgn"]
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.971677 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401ac87c-7d66-4407-8a63-cb5f3012e732" path="/var/lib/kubelet/pods/401ac87c-7d66-4407-8a63-cb5f3012e732/volumes"
Mar 11 19:20:42 crc kubenswrapper[4842]: I0311 19:20:42.972852 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29d634a-779e-4248-b485-a586c702d4b4" path="/var/lib/kubelet/pods/c29d634a-779e-4248-b485-a586c702d4b4/volumes"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.024706 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell00a69-account-delete-dsnhz"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.029825 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.039772 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zhtf\" (UniqueName: \"kubernetes.io/projected/49b8f2b3-b127-40ab-925b-4bec30c47198-kube-api-access-7zhtf\") pod \"novaapib1de-account-delete-fqcgn\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.040023 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b8f2b3-b127-40ab-925b-4bec30c47198-operator-scripts\") pod \"novaapib1de-account-delete-fqcgn\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.046363 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell00a69-account-delete-dsnhz"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.099902 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.100584 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="434157e1-507f-4acd-b1ad-b8c8add9d863" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.121845 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell1882e-account-delete-v24zl"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.122853 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.141759 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zhtf\" (UniqueName: \"kubernetes.io/projected/49b8f2b3-b127-40ab-925b-4bec30c47198-kube-api-access-7zhtf\") pod \"novaapib1de-account-delete-fqcgn\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.141809 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b8f2b3-b127-40ab-925b-4bec30c47198-operator-scripts\") pod \"novaapib1de-account-delete-fqcgn\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.141835 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knr7c\" (UniqueName: \"kubernetes.io/projected/b9929547-5069-4ebb-985e-18afea3f0641-kube-api-access-knr7c\") pod \"novacell00a69-account-delete-dsnhz\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.141935 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9929547-5069-4ebb-985e-18afea3f0641-operator-scripts\") pod \"novacell00a69-account-delete-dsnhz\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.143214 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b8f2b3-b127-40ab-925b-4bec30c47198-operator-scripts\") pod \"novaapib1de-account-delete-fqcgn\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.159126 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1882e-account-delete-v24zl"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.166210 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.169571 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" containerName="nova-kuttl-cell1-novncproxy-novncproxy" containerID="cri-o://93812d34795315ed6f64e2f14a3507fe676d9f9021270ba164b62bc3152489a6" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.185575 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zhtf\" (UniqueName: \"kubernetes.io/projected/49b8f2b3-b127-40ab-925b-4bec30c47198-kube-api-access-7zhtf\") pod \"novaapib1de-account-delete-fqcgn\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.230310 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.230532 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="e62ac3a6-d61b-47b0-a91f-0772398f3ddc" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://18240a5aba4e417e66b11f8e254e8b82251caf43c2f4b8a02303a313fe3c2828" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.232894 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.233156 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-log" containerID="cri-o://daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.233333 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.243255 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b66ea655-2e59-4c9c-a0a3-710d38cab666-operator-scripts\") pod \"novacell1882e-account-delete-v24zl\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.243356 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knr7c\" (UniqueName: \"kubernetes.io/projected/b9929547-5069-4ebb-985e-18afea3f0641-kube-api-access-knr7c\") pod \"novacell00a69-account-delete-dsnhz\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.243418 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tz6w\" (UniqueName: \"kubernetes.io/projected/b66ea655-2e59-4c9c-a0a3-710d38cab666-kube-api-access-2tz6w\") pod \"novacell1882e-account-delete-v24zl\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.243465 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9929547-5069-4ebb-985e-18afea3f0641-operator-scripts\") pod \"novacell00a69-account-delete-dsnhz\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.244110 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9929547-5069-4ebb-985e-18afea3f0641-operator-scripts\") pod \"novacell00a69-account-delete-dsnhz\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.250430 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.258611 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.262170 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-gmxxz"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.300321 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.310073 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knr7c\" (UniqueName: \"kubernetes.io/projected/b9929547-5069-4ebb-985e-18afea3f0641-kube-api-access-knr7c\") pod \"novacell00a69-account-delete-dsnhz\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.316550 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.316740 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.320915 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-tk4f2"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.325916 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.326123 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-log" containerID="cri-o://c52d1549df7ed945b169cd516fa77847b9096f38e9cc5fa49b918dbee880d998" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.326674 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-api" containerID="cri-o://bb3cf014b033909853f63dc9436857d7962c07b4e9e816dea85324527c2bd4f5" gracePeriod=30
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.347845 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tz6w\" (UniqueName: \"kubernetes.io/projected/b66ea655-2e59-4c9c-a0a3-710d38cab666-kube-api-access-2tz6w\") pod \"novacell1882e-account-delete-v24zl\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.348398 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b66ea655-2e59-4c9c-a0a3-710d38cab666-operator-scripts\") pod \"novacell1882e-account-delete-v24zl\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.349297 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b66ea655-2e59-4c9c-a0a3-710d38cab666-operator-scripts\") pod \"novacell1882e-account-delete-v24zl\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.353847 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.383901 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tz6w\" (UniqueName: \"kubernetes.io/projected/b66ea655-2e59-4c9c-a0a3-710d38cab666-kube-api-access-2tz6w\") pod \"novacell1882e-account-delete-v24zl\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.441886 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell1882e-account-delete-v24zl"
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.828278 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapib1de-account-delete-fqcgn"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.914596 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell00a69-account-delete-dsnhz"]
Mar 11 19:20:43 crc kubenswrapper[4842]: I0311 19:20:43.980598 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell1882e-account-delete-v24zl"]
Mar 11 19:20:44 crc kubenswrapper[4842]: W0311 19:20:44.037403 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb66ea655_2e59_4c9c_a0a3_710d38cab666.slice/crio-6ccb635ad7f3cdd59c610618af9e2ac4ec64748e7a7ad86ab86ff45552e3310f WatchSource:0}: Error finding container 6ccb635ad7f3cdd59c610618af9e2ac4ec64748e7a7ad86ab86ff45552e3310f: Status 404 returned error can't find the container with id 6ccb635ad7f3cdd59c610618af9e2ac4ec64748e7a7ad86ab86ff45552e3310f
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.075782 4842 generic.go:334] "Generic (PLEG): container finished" podID="d25f746b-58f8-48ec-8f73-f367a6148143" containerID="c52d1549df7ed945b169cd516fa77847b9096f38e9cc5fa49b918dbee880d998" exitCode=143
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.075834 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"d25f746b-58f8-48ec-8f73-f367a6148143","Type":"ContainerDied","Data":"c52d1549df7ed945b169cd516fa77847b9096f38e9cc5fa49b918dbee880d998"}
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.080065 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz" event={"ID":"b9929547-5069-4ebb-985e-18afea3f0641","Type":"ContainerStarted","Data":"57a98fb69a56cb9cae10468293909c6609332607e8db5b4ebcc431eceefec5d0"}
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.102117 4842 generic.go:334] "Generic (PLEG): container finished" podID="cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" containerID="93812d34795315ed6f64e2f14a3507fe676d9f9021270ba164b62bc3152489a6" exitCode=0
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.102235 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a","Type":"ContainerDied","Data":"93812d34795315ed6f64e2f14a3507fe676d9f9021270ba164b62bc3152489a6"}
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.114061 4842 generic.go:334] "Generic (PLEG): container finished" podID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerID="daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea" exitCode=143
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.114138 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8c7aaf72-23a2-4399-af3c-2997d9a88e0e","Type":"ContainerDied","Data":"daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea"}
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.115910 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1882e-account-delete-v24zl" event={"ID":"b66ea655-2e59-4c9c-a0a3-710d38cab666","Type":"ContainerStarted","Data":"6ccb635ad7f3cdd59c610618af9e2ac4ec64748e7a7ad86ab86ff45552e3310f"}
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.117305 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn" event={"ID":"49b8f2b3-b127-40ab-925b-4bec30c47198","Type":"ContainerStarted","Data":"038a1fdddc46f534bc00daf359efedf0d6cabbf8e304e639006e1a2d50f69ebe"}
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.117331 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn" event={"ID":"49b8f2b3-b127-40ab-925b-4bec30c47198","Type":"ContainerStarted","Data":"c9f00e72cb149e70f235d30422b8e719c52e4dfe87b1d991fb3fd209a7ab60d7"}
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.139307 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn" podStartSLOduration=2.139293674 podStartE2EDuration="2.139293674s" podCreationTimestamp="2026-03-11 19:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:44.134875817 +0000 UTC m=+1889.782572097" watchObservedRunningTime="2026-03-11 19:20:44.139293674 +0000 UTC m=+1889.786989954"
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.517683 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:44 crc kubenswrapper[4842]: E0311 19:20:44.640424 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:20:44 crc kubenswrapper[4842]: E0311 19:20:44.652456 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:20:44 crc kubenswrapper[4842]: E0311 19:20:44.654091 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:20:44 crc kubenswrapper[4842]: E0311 19:20:44.654150 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.673198 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kksv4\" (UniqueName: \"kubernetes.io/projected/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-kube-api-access-kksv4\") pod \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") "
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.673315 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-config-data\") pod \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\" (UID: \"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a\") "
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.693151 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-kube-api-access-kksv4" (OuterVolumeSpecName: "kube-api-access-kksv4") pod "cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" (UID: "cc1823b2-7f89-47ff-936d-1b2ad52f5d7a"). InnerVolumeSpecName "kube-api-access-kksv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.703598 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-config-data" (OuterVolumeSpecName: "config-data") pod "cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" (UID: "cc1823b2-7f89-47ff-936d-1b2ad52f5d7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.775107 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kksv4\" (UniqueName: \"kubernetes.io/projected/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-kube-api-access-kksv4\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.775141 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.972962 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.977391 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190ff3a6-1b86-490e-9e67-324a02a35b55" path="/var/lib/kubelet/pods/190ff3a6-1b86-490e-9e67-324a02a35b55/volumes"
Mar 11 19:20:44 crc kubenswrapper[4842]: I0311 19:20:44.978454 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fe11e4-c139-4cab-bcc4-989b2e2fb979" path="/var/lib/kubelet/pods/a7fe11e4-c139-4cab-bcc4-989b2e2fb979/volumes"
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.080089 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbcgw\" (UniqueName: \"kubernetes.io/projected/434157e1-507f-4acd-b1ad-b8c8add9d863-kube-api-access-wbcgw\") pod \"434157e1-507f-4acd-b1ad-b8c8add9d863\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") "
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.080208 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434157e1-507f-4acd-b1ad-b8c8add9d863-config-data\") pod \"434157e1-507f-4acd-b1ad-b8c8add9d863\" (UID: \"434157e1-507f-4acd-b1ad-b8c8add9d863\") "
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.085852 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434157e1-507f-4acd-b1ad-b8c8add9d863-kube-api-access-wbcgw" (OuterVolumeSpecName: "kube-api-access-wbcgw") pod "434157e1-507f-4acd-b1ad-b8c8add9d863" (UID: "434157e1-507f-4acd-b1ad-b8c8add9d863"). InnerVolumeSpecName "kube-api-access-wbcgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.116065 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/434157e1-507f-4acd-b1ad-b8c8add9d863-config-data" (OuterVolumeSpecName: "config-data") pod "434157e1-507f-4acd-b1ad-b8c8add9d863" (UID: "434157e1-507f-4acd-b1ad-b8c8add9d863"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.128751 4842 generic.go:334] "Generic (PLEG): container finished" podID="49b8f2b3-b127-40ab-925b-4bec30c47198" containerID="038a1fdddc46f534bc00daf359efedf0d6cabbf8e304e639006e1a2d50f69ebe" exitCode=0
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.128844 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn" event={"ID":"49b8f2b3-b127-40ab-925b-4bec30c47198","Type":"ContainerDied","Data":"038a1fdddc46f534bc00daf359efedf0d6cabbf8e304e639006e1a2d50f69ebe"}
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.130906 4842 generic.go:334] "Generic (PLEG): container finished" podID="434157e1-507f-4acd-b1ad-b8c8add9d863" containerID="65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818" exitCode=0
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.130951 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.130993 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"434157e1-507f-4acd-b1ad-b8c8add9d863","Type":"ContainerDied","Data":"65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818"}
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.131033 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"434157e1-507f-4acd-b1ad-b8c8add9d863","Type":"ContainerDied","Data":"6ad7bd17a843742cc0c53bd876a54e895d6e206ca379221913e565aa8c2fa860"}
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.131050 4842 scope.go:117] "RemoveContainer" containerID="65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818"
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.133853 4842 generic.go:334] "Generic (PLEG): container finished" podID="b9929547-5069-4ebb-985e-18afea3f0641" containerID="c385a966091517d2c86c87eba300536ca997fafdd28136caf47a7b41b884aab0" exitCode=0
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.133938 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz" event={"ID":"b9929547-5069-4ebb-985e-18afea3f0641","Type":"ContainerDied","Data":"c385a966091517d2c86c87eba300536ca997fafdd28136caf47a7b41b884aab0"}
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.135948 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"cc1823b2-7f89-47ff-936d-1b2ad52f5d7a","Type":"ContainerDied","Data":"1df350bec6b2985a864cda0bbcd1ab2f72fe7d59973b1644dc869bd95fd21620"}
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.136639 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.153957 4842 generic.go:334] "Generic (PLEG): container finished" podID="b66ea655-2e59-4c9c-a0a3-710d38cab666" containerID="e6aba4d908ecd1c794034cb32cb88dff065808bc22b031ecfcf56dcacb306032" exitCode=0
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.154003 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1882e-account-delete-v24zl" event={"ID":"b66ea655-2e59-4c9c-a0a3-710d38cab666","Type":"ContainerDied","Data":"e6aba4d908ecd1c794034cb32cb88dff065808bc22b031ecfcf56dcacb306032"}
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.181864 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/434157e1-507f-4acd-b1ad-b8c8add9d863-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.181909 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbcgw\" (UniqueName: \"kubernetes.io/projected/434157e1-507f-4acd-b1ad-b8c8add9d863-kube-api-access-wbcgw\") on node \"crc\" DevicePath \"\""
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.183586 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.184195 4842 scope.go:117] "RemoveContainer" containerID="65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818"
Mar 11 19:20:45 crc kubenswrapper[4842]: E0311 19:20:45.184751 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818\": container with ID starting with 65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818 not found: ID does not exist" containerID="65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818"
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.184786 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818"} err="failed to get container status \"65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818\": rpc error: code = NotFound desc = could not find container \"65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818\": container with ID starting with 65ff5190e4708f46f6ab113db2d9404ecb8409f3760904a5e6eed11c85b4c818 not found: ID does not exist"
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.184807 4842 scope.go:117] "RemoveContainer" containerID="93812d34795315ed6f64e2f14a3507fe676d9f9021270ba164b62bc3152489a6"
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.192292 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.207917 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.222817 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:20:45 crc kubenswrapper[4842]: I0311 19:20:45.982864 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0"
Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.105877 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-config-data\") pod \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") "
Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.106104 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhk2s\" (UniqueName: \"kubernetes.io/projected/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-kube-api-access-dhk2s\") pod \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\" (UID: \"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142\") "
Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.125940 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-kube-api-access-dhk2s" (OuterVolumeSpecName: "kube-api-access-dhk2s") pod "d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" (UID: "d067782b-b6da-4dcc-a0a9-0ecbdcfcf142"). InnerVolumeSpecName "kube-api-access-dhk2s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.137494 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-config-data" (OuterVolumeSpecName: "config-data") pod "d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" (UID: "d067782b-b6da-4dcc-a0a9-0ecbdcfcf142"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.215970 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.216018 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhk2s\" (UniqueName: \"kubernetes.io/projected/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142-kube-api-access-dhk2s\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.252314 4842 generic.go:334] "Generic (PLEG): container finished" podID="d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" containerID="511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" exitCode=0 Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.252474 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.252973 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142","Type":"ContainerDied","Data":"511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b"} Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.253046 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"d067782b-b6da-4dcc-a0a9-0ecbdcfcf142","Type":"ContainerDied","Data":"8de75d25ecd546f2b0f0f2db4b81ab6b1c7d0aab2a6cbd6cce92ddb3269141c2"} Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.253071 4842 scope.go:117] "RemoveContainer" containerID="511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.305373 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.312991 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.327449 4842 scope.go:117] "RemoveContainer" containerID="511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" Mar 11 19:20:46 crc kubenswrapper[4842]: E0311 19:20:46.333386 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b\": container with ID starting with 511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b not found: ID does not exist" containerID="511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.333426 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b"} err="failed to get container status \"511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b\": rpc error: code = NotFound desc = could not find container \"511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b\": container with ID starting with 511d0594f736aca22ac8c969111971c85e25b1233585135f218d7bbec897c06b not found: ID does not exist" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.661143 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell1882e-account-delete-v24zl" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.822897 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b66ea655-2e59-4c9c-a0a3-710d38cab666-operator-scripts\") pod \"b66ea655-2e59-4c9c-a0a3-710d38cab666\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.823002 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tz6w\" (UniqueName: \"kubernetes.io/projected/b66ea655-2e59-4c9c-a0a3-710d38cab666-kube-api-access-2tz6w\") pod \"b66ea655-2e59-4c9c-a0a3-710d38cab666\" (UID: \"b66ea655-2e59-4c9c-a0a3-710d38cab666\") " Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.824580 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b66ea655-2e59-4c9c-a0a3-710d38cab666-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b66ea655-2e59-4c9c-a0a3-710d38cab666" (UID: "b66ea655-2e59-4c9c-a0a3-710d38cab666"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.830068 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66ea655-2e59-4c9c-a0a3-710d38cab666-kube-api-access-2tz6w" (OuterVolumeSpecName: "kube-api-access-2tz6w") pod "b66ea655-2e59-4c9c-a0a3-710d38cab666" (UID: "b66ea655-2e59-4c9c-a0a3-710d38cab666"). InnerVolumeSpecName "kube-api-access-2tz6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.923941 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.926395 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b66ea655-2e59-4c9c-a0a3-710d38cab666-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.926438 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tz6w\" (UniqueName: \"kubernetes.io/projected/b66ea655-2e59-4c9c-a0a3-710d38cab666-kube-api-access-2tz6w\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.930506 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.972307 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="434157e1-507f-4acd-b1ad-b8c8add9d863" path="/var/lib/kubelet/pods/434157e1-507f-4acd-b1ad-b8c8add9d863/volumes" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.973103 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" path="/var/lib/kubelet/pods/cc1823b2-7f89-47ff-936d-1b2ad52f5d7a/volumes" Mar 11 19:20:46 crc kubenswrapper[4842]: I0311 19:20:46.973726 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" path="/var/lib/kubelet/pods/d067782b-b6da-4dcc-a0a9-0ecbdcfcf142/volumes" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.030222 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zhtf\" (UniqueName: \"kubernetes.io/projected/49b8f2b3-b127-40ab-925b-4bec30c47198-kube-api-access-7zhtf\") pod \"49b8f2b3-b127-40ab-925b-4bec30c47198\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " Mar 11 19:20:47 crc 
kubenswrapper[4842]: I0311 19:20:47.030488 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b8f2b3-b127-40ab-925b-4bec30c47198-operator-scripts\") pod \"49b8f2b3-b127-40ab-925b-4bec30c47198\" (UID: \"49b8f2b3-b127-40ab-925b-4bec30c47198\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.030529 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knr7c\" (UniqueName: \"kubernetes.io/projected/b9929547-5069-4ebb-985e-18afea3f0641-kube-api-access-knr7c\") pod \"b9929547-5069-4ebb-985e-18afea3f0641\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.030569 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9929547-5069-4ebb-985e-18afea3f0641-operator-scripts\") pod \"b9929547-5069-4ebb-985e-18afea3f0641\" (UID: \"b9929547-5069-4ebb-985e-18afea3f0641\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.031092 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b8f2b3-b127-40ab-925b-4bec30c47198-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49b8f2b3-b127-40ab-925b-4bec30c47198" (UID: "49b8f2b3-b127-40ab-925b-4bec30c47198"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.031351 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9929547-5069-4ebb-985e-18afea3f0641-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9929547-5069-4ebb-985e-18afea3f0641" (UID: "b9929547-5069-4ebb-985e-18afea3f0641"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.034952 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9929547-5069-4ebb-985e-18afea3f0641-kube-api-access-knr7c" (OuterVolumeSpecName: "kube-api-access-knr7c") pod "b9929547-5069-4ebb-985e-18afea3f0641" (UID: "b9929547-5069-4ebb-985e-18afea3f0641"). InnerVolumeSpecName "kube-api-access-knr7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.037826 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b8f2b3-b127-40ab-925b-4bec30c47198-kube-api-access-7zhtf" (OuterVolumeSpecName: "kube-api-access-7zhtf") pod "49b8f2b3-b127-40ab-925b-4bec30c47198" (UID: "49b8f2b3-b127-40ab-925b-4bec30c47198"). InnerVolumeSpecName "kube-api-access-7zhtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.125555 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.132381 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b8f2b3-b127-40ab-925b-4bec30c47198-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.132406 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knr7c\" (UniqueName: \"kubernetes.io/projected/b9929547-5069-4ebb-985e-18afea3f0641-kube-api-access-knr7c\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.132417 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9929547-5069-4ebb-985e-18afea3f0641-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.132427 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zhtf\" (UniqueName: \"kubernetes.io/projected/49b8f2b3-b127-40ab-925b-4bec30c47198-kube-api-access-7zhtf\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.233482 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4hdq\" (UniqueName: \"kubernetes.io/projected/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-kube-api-access-h4hdq\") pod \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.233606 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-logs\") pod \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.233639 4842 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-config-data\") pod \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\" (UID: \"8c7aaf72-23a2-4399-af3c-2997d9a88e0e\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.234572 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-logs" (OuterVolumeSpecName: "logs") pod "8c7aaf72-23a2-4399-af3c-2997d9a88e0e" (UID: "8c7aaf72-23a2-4399-af3c-2997d9a88e0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.237149 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-kube-api-access-h4hdq" (OuterVolumeSpecName: "kube-api-access-h4hdq") pod "8c7aaf72-23a2-4399-af3c-2997d9a88e0e" (UID: "8c7aaf72-23a2-4399-af3c-2997d9a88e0e"). InnerVolumeSpecName "kube-api-access-h4hdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.255445 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-config-data" (OuterVolumeSpecName: "config-data") pod "8c7aaf72-23a2-4399-af3c-2997d9a88e0e" (UID: "8c7aaf72-23a2-4399-af3c-2997d9a88e0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.285518 4842 generic.go:334] "Generic (PLEG): container finished" podID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerID="9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215" exitCode=0 Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.285595 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8c7aaf72-23a2-4399-af3c-2997d9a88e0e","Type":"ContainerDied","Data":"9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.285628 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"8c7aaf72-23a2-4399-af3c-2997d9a88e0e","Type":"ContainerDied","Data":"576d6b1b313ca26ae4fc90296d46f54a54333ca72c161a96a6b39a15c7c6a071"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.285628 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.285646 4842 scope.go:117] "RemoveContainer" containerID="9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.296305 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell1882e-account-delete-v24zl" event={"ID":"b66ea655-2e59-4c9c-a0a3-710d38cab666","Type":"ContainerDied","Data":"6ccb635ad7f3cdd59c610618af9e2ac4ec64748e7a7ad86ab86ff45552e3310f"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.296347 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ccb635ad7f3cdd59c610618af9e2ac4ec64748e7a7ad86ab86ff45552e3310f" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.296410 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell1882e-account-delete-v24zl" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.298599 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn" event={"ID":"49b8f2b3-b127-40ab-925b-4bec30c47198","Type":"ContainerDied","Data":"c9f00e72cb149e70f235d30422b8e719c52e4dfe87b1d991fb3fd209a7ab60d7"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.298621 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9f00e72cb149e70f235d30422b8e719c52e4dfe87b1d991fb3fd209a7ab60d7" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.298654 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapib1de-account-delete-fqcgn" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.300230 4842 generic.go:334] "Generic (PLEG): container finished" podID="d25f746b-58f8-48ec-8f73-f367a6148143" containerID="bb3cf014b033909853f63dc9436857d7962c07b4e9e816dea85324527c2bd4f5" exitCode=0 Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.300342 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"d25f746b-58f8-48ec-8f73-f367a6148143","Type":"ContainerDied","Data":"bb3cf014b033909853f63dc9436857d7962c07b4e9e816dea85324527c2bd4f5"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.300365 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"d25f746b-58f8-48ec-8f73-f367a6148143","Type":"ContainerDied","Data":"8958b884638366e3729a849485d6786b8c16548ec71f9c9cf16e3b0350967b27"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.300377 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8958b884638366e3729a849485d6786b8c16548ec71f9c9cf16e3b0350967b27" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.303652 4842 
generic.go:334] "Generic (PLEG): container finished" podID="e62ac3a6-d61b-47b0-a91f-0772398f3ddc" containerID="18240a5aba4e417e66b11f8e254e8b82251caf43c2f4b8a02303a313fe3c2828" exitCode=0 Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.303677 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"e62ac3a6-d61b-47b0-a91f-0772398f3ddc","Type":"ContainerDied","Data":"18240a5aba4e417e66b11f8e254e8b82251caf43c2f4b8a02303a313fe3c2828"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.303701 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"e62ac3a6-d61b-47b0-a91f-0772398f3ddc","Type":"ContainerDied","Data":"a40a0f4ff475c2a5e381e52c5cbb8602064ed87d25ffc8bfdb58b64f5441ee92"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.303712 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a40a0f4ff475c2a5e381e52c5cbb8602064ed87d25ffc8bfdb58b64f5441ee92" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.306132 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz" event={"ID":"b9929547-5069-4ebb-985e-18afea3f0641","Type":"ContainerDied","Data":"57a98fb69a56cb9cae10468293909c6609332607e8db5b4ebcc431eceefec5d0"} Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.306169 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell00a69-account-delete-dsnhz" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.306169 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57a98fb69a56cb9cae10468293909c6609332607e8db5b4ebcc431eceefec5d0" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.312241 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.320162 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.336758 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.336825 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.336839 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4hdq\" (UniqueName: \"kubernetes.io/projected/8c7aaf72-23a2-4399-af3c-2997d9a88e0e-kube-api-access-h4hdq\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.359320 4842 scope.go:117] "RemoveContainer" containerID="daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.367262 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.373488 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.386475 4842 scope.go:117] "RemoveContainer" containerID="9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215" Mar 11 19:20:47 crc kubenswrapper[4842]: E0311 19:20:47.387078 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215\": container with ID starting with 9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215 not found: ID does not exist" containerID="9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.387136 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215"} err="failed to get container status \"9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215\": rpc error: code = NotFound desc = could not find container \"9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215\": container with ID starting with 9a7e093a796af7d4d39313f06f7c50236008c681243ffbee0edfac1f9f1b7215 not found: ID does not exist" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.387169 4842 scope.go:117] "RemoveContainer" containerID="daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea" Mar 11 19:20:47 crc kubenswrapper[4842]: E0311 19:20:47.387710 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea\": container with ID starting with daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea not found: ID does not exist" containerID="daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.387751 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea"} err="failed to get container status \"daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea\": rpc error: code = NotFound desc = could not find container \"daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea\": container with ID 
starting with daaee1e6c2eda7905e7d7482b48aa5429768a81088981e616c1b353ee8e282ea not found: ID does not exist" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.438938 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-config-data\") pod \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\" (UID: \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.439062 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq28m\" (UniqueName: \"kubernetes.io/projected/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-kube-api-access-sq28m\") pod \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\" (UID: \"e62ac3a6-d61b-47b0-a91f-0772398f3ddc\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.439104 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25f746b-58f8-48ec-8f73-f367a6148143-config-data\") pod \"d25f746b-58f8-48ec-8f73-f367a6148143\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.439140 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xshk\" (UniqueName: \"kubernetes.io/projected/d25f746b-58f8-48ec-8f73-f367a6148143-kube-api-access-2xshk\") pod \"d25f746b-58f8-48ec-8f73-f367a6148143\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.439252 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d25f746b-58f8-48ec-8f73-f367a6148143-logs\") pod \"d25f746b-58f8-48ec-8f73-f367a6148143\" (UID: \"d25f746b-58f8-48ec-8f73-f367a6148143\") " Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.439771 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/d25f746b-58f8-48ec-8f73-f367a6148143-logs" (OuterVolumeSpecName: "logs") pod "d25f746b-58f8-48ec-8f73-f367a6148143" (UID: "d25f746b-58f8-48ec-8f73-f367a6148143"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.442147 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-kube-api-access-sq28m" (OuterVolumeSpecName: "kube-api-access-sq28m") pod "e62ac3a6-d61b-47b0-a91f-0772398f3ddc" (UID: "e62ac3a6-d61b-47b0-a91f-0772398f3ddc"). InnerVolumeSpecName "kube-api-access-sq28m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.455767 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d25f746b-58f8-48ec-8f73-f367a6148143-kube-api-access-2xshk" (OuterVolumeSpecName: "kube-api-access-2xshk") pod "d25f746b-58f8-48ec-8f73-f367a6148143" (UID: "d25f746b-58f8-48ec-8f73-f367a6148143"). InnerVolumeSpecName "kube-api-access-2xshk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.459608 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-config-data" (OuterVolumeSpecName: "config-data") pod "e62ac3a6-d61b-47b0-a91f-0772398f3ddc" (UID: "e62ac3a6-d61b-47b0-a91f-0772398f3ddc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.460379 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d25f746b-58f8-48ec-8f73-f367a6148143-config-data" (OuterVolumeSpecName: "config-data") pod "d25f746b-58f8-48ec-8f73-f367a6148143" (UID: "d25f746b-58f8-48ec-8f73-f367a6148143"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.541782 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq28m\" (UniqueName: \"kubernetes.io/projected/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-kube-api-access-sq28m\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.541850 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d25f746b-58f8-48ec-8f73-f367a6148143-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.541869 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xshk\" (UniqueName: \"kubernetes.io/projected/d25f746b-58f8-48ec-8f73-f367a6148143-kube-api-access-2xshk\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.541891 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d25f746b-58f8-48ec-8f73-f367a6148143-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.541910 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e62ac3a6-d61b-47b0-a91f-0772398f3ddc-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.962032 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-db-create-5cj2c"] Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.975051 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-db-create-5cj2c"] Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.987014 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novaapib1de-account-delete-fqcgn"] Mar 11 19:20:47 crc kubenswrapper[4842]: I0311 19:20:47.998600 
4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-b1de-account-create-update-tdww2"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.004646 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-b1de-account-create-update-tdww2"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.011197 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novaapib1de-account-delete-fqcgn"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.055854 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gttk6"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.067250 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-gttk6"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.082093 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.087520 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell00a69-account-delete-dsnhz"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.094483 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-0a69-account-create-update-gp59s"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.099674 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell00a69-account-delete-dsnhz"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.153416 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-p2qrl"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.160324 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-p2qrl"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.170433 4842 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.176455 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell1882e-account-delete-v24zl"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.187261 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-882e-account-create-update-vgnhw"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.202893 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell1882e-account-delete-v24zl"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.315737 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.323375 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.359452 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.371250 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.380164 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.388816 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.978946 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1550b18b-bd82-4f4e-a758-418ebc45100e" path="/var/lib/kubelet/pods/1550b18b-bd82-4f4e-a758-418ebc45100e/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 
19:20:48.980229 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3099459d-b582-437c-85a8-2e10562224c9" path="/var/lib/kubelet/pods/3099459d-b582-437c-85a8-2e10562224c9/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.981339 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dddeff-d94c-4d7f-978a-d6930ee4d555" path="/var/lib/kubelet/pods/32dddeff-d94c-4d7f-978a-d6930ee4d555/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.982390 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b0dd88-16ef-4dc6-9f3b-a276fd87c154" path="/var/lib/kubelet/pods/41b0dd88-16ef-4dc6-9f3b-a276fd87c154/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.984334 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b8f2b3-b127-40ab-925b-4bec30c47198" path="/var/lib/kubelet/pods/49b8f2b3-b127-40ab-925b-4bec30c47198/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.985385 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c323f17-b38b-4762-b699-0f53719ebe74" path="/var/lib/kubelet/pods/7c323f17-b38b-4762-b699-0f53719ebe74/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.986749 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" path="/var/lib/kubelet/pods/8c7aaf72-23a2-4399-af3c-2997d9a88e0e/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.989324 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4930d1d-6a6a-411f-b0d8-5efb91c732f3" path="/var/lib/kubelet/pods/b4930d1d-6a6a-411f-b0d8-5efb91c732f3/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.990526 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66ea655-2e59-4c9c-a0a3-710d38cab666" path="/var/lib/kubelet/pods/b66ea655-2e59-4c9c-a0a3-710d38cab666/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 
19:20:48.991466 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9929547-5069-4ebb-985e-18afea3f0641" path="/var/lib/kubelet/pods/b9929547-5069-4ebb-985e-18afea3f0641/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.993433 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" path="/var/lib/kubelet/pods/d25f746b-58f8-48ec-8f73-f367a6148143/volumes" Mar 11 19:20:48 crc kubenswrapper[4842]: I0311 19:20:48.994473 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e62ac3a6-d61b-47b0-a91f-0772398f3ddc" path="/var/lib/kubelet/pods/e62ac3a6-d61b-47b0-a91f-0772398f3ddc/volumes" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.168572 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-db-create-nnvg2"] Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169244 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-metadata" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169282 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-metadata" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169306 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9929547-5069-4ebb-985e-18afea3f0641" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169315 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9929547-5069-4ebb-985e-18afea3f0641" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169335 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169346 4842 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169370 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434157e1-507f-4acd-b1ad-b8c8add9d863" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169379 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="434157e1-507f-4acd-b1ad-b8c8add9d863" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169388 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-log" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169395 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-log" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169412 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-log" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169419 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-log" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169430 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169438 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169458 4842 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-api" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169466 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-api" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169478 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b8f2b3-b127-40ab-925b-4bec30c47198" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169486 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b8f2b3-b127-40ab-925b-4bec30c47198" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169499 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62ac3a6-d61b-47b0-a91f-0772398f3ddc" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169507 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62ac3a6-d61b-47b0-a91f-0772398f3ddc" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 19:20:50 crc kubenswrapper[4842]: E0311 19:20:50.169521 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66ea655-2e59-4c9c-a0a3-710d38cab666" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169529 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66ea655-2e59-4c9c-a0a3-710d38cab666" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169701 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1823b2-7f89-47ff-936d-1b2ad52f5d7a" containerName="nova-kuttl-cell1-novncproxy-novncproxy" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169724 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e62ac3a6-d61b-47b0-a91f-0772398f3ddc" containerName="nova-kuttl-cell1-conductor-conductor" Mar 11 
19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169740 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b8f2b3-b127-40ab-925b-4bec30c47198" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169749 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-metadata" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169759 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66ea655-2e59-4c9c-a0a3-710d38cab666" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169773 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-log" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169786 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="434157e1-507f-4acd-b1ad-b8c8add9d863" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169798 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7aaf72-23a2-4399-af3c-2997d9a88e0e" containerName="nova-kuttl-metadata-log" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169812 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d067782b-b6da-4dcc-a0a9-0ecbdcfcf142" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169824 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d25f746b-58f8-48ec-8f73-f367a6148143" containerName="nova-kuttl-api-api" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.169831 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9929547-5069-4ebb-985e-18afea3f0641" containerName="mariadb-account-delete" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.170436 4842 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.183925 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-nnvg2"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.260858 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-b7dsg"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.263067 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.276939 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-b7dsg"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.284499 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1580830f-2870-436d-b982-a1775fc494bb-operator-scripts\") pod \"nova-api-db-create-nnvg2\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.284565 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxgsp\" (UniqueName: \"kubernetes.io/projected/1580830f-2870-436d-b982-a1775fc494bb-kube-api-access-lxgsp\") pod \"nova-api-db-create-nnvg2\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.370165 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-76zdk"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.371226 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.385763 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhqmb\" (UniqueName: \"kubernetes.io/projected/4f67413c-94d7-4948-aec7-086827349cc6-kube-api-access-zhqmb\") pod \"nova-cell0-db-create-b7dsg\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.386490 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1580830f-2870-436d-b982-a1775fc494bb-operator-scripts\") pod \"nova-api-db-create-nnvg2\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.386580 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxgsp\" (UniqueName: \"kubernetes.io/projected/1580830f-2870-436d-b982-a1775fc494bb-kube-api-access-lxgsp\") pod \"nova-api-db-create-nnvg2\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.386663 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f67413c-94d7-4948-aec7-086827349cc6-operator-scripts\") pod \"nova-cell0-db-create-b7dsg\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.388086 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1580830f-2870-436d-b982-a1775fc494bb-operator-scripts\") pod 
\"nova-api-db-create-nnvg2\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.389946 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-api-d894-account-create-update-m4xhn"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.391314 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.395553 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-api-db-secret" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.413059 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxgsp\" (UniqueName: \"kubernetes.io/projected/1580830f-2870-436d-b982-a1775fc494bb-kube-api-access-lxgsp\") pod \"nova-api-db-create-nnvg2\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.418134 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-76zdk"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.443307 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-d894-account-create-update-m4xhn"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.488168 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.489105 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f67413c-94d7-4948-aec7-086827349cc6-operator-scripts\") pod \"nova-cell0-db-create-b7dsg\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.490044 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f67413c-94d7-4948-aec7-086827349cc6-operator-scripts\") pod \"nova-cell0-db-create-b7dsg\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.493289 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-operator-scripts\") pod \"nova-cell1-db-create-76zdk\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.493504 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhqmb\" (UniqueName: \"kubernetes.io/projected/4f67413c-94d7-4948-aec7-086827349cc6-kube-api-access-zhqmb\") pod \"nova-cell0-db-create-b7dsg\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.493702 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wz9z\" (UniqueName: \"kubernetes.io/projected/6134a356-636f-4379-9bd1-86db49454ca5-kube-api-access-8wz9z\") pod 
\"nova-api-d894-account-create-update-m4xhn\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.493746 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rl8\" (UniqueName: \"kubernetes.io/projected/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-kube-api-access-f2rl8\") pod \"nova-cell1-db-create-76zdk\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.493793 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6134a356-636f-4379-9bd1-86db49454ca5-operator-scripts\") pod \"nova-api-d894-account-create-update-m4xhn\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.516301 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhqmb\" (UniqueName: \"kubernetes.io/projected/4f67413c-94d7-4948-aec7-086827349cc6-kube-api-access-zhqmb\") pod \"nova-cell0-db-create-b7dsg\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.575531 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.576695 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.579840 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell0-db-secret" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.589196 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.589815 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.597358 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wz9z\" (UniqueName: \"kubernetes.io/projected/6134a356-636f-4379-9bd1-86db49454ca5-kube-api-access-8wz9z\") pod \"nova-api-d894-account-create-update-m4xhn\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.597416 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rl8\" (UniqueName: \"kubernetes.io/projected/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-kube-api-access-f2rl8\") pod \"nova-cell1-db-create-76zdk\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.597450 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6134a356-636f-4379-9bd1-86db49454ca5-operator-scripts\") pod \"nova-api-d894-account-create-update-m4xhn\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.597532 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-operator-scripts\") pod \"nova-cell1-db-create-76zdk\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.598214 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-operator-scripts\") pod \"nova-cell1-db-create-76zdk\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.599866 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6134a356-636f-4379-9bd1-86db49454ca5-operator-scripts\") pod \"nova-api-d894-account-create-update-m4xhn\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.633219 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2rl8\" (UniqueName: \"kubernetes.io/projected/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-kube-api-access-f2rl8\") pod \"nova-cell1-db-create-76zdk\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.633266 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wz9z\" (UniqueName: \"kubernetes.io/projected/6134a356-636f-4379-9bd1-86db49454ca5-kube-api-access-8wz9z\") pod \"nova-api-d894-account-create-update-m4xhn\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc 
kubenswrapper[4842]: I0311 19:20:50.687405 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.699417 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gl7\" (UniqueName: \"kubernetes.io/projected/23b02d76-eca4-492a-b9c7-29a77627d816-kube-api-access-99gl7\") pod \"nova-cell0-187d-account-create-update-lbfmn\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.699464 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b02d76-eca4-492a-b9c7-29a77627d816-operator-scripts\") pod \"nova-cell0-187d-account-create-update-lbfmn\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.716601 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.774946 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.776029 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.778202 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-cell1-db-secret" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.784138 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw"] Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.801887 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99gl7\" (UniqueName: \"kubernetes.io/projected/23b02d76-eca4-492a-b9c7-29a77627d816-kube-api-access-99gl7\") pod \"nova-cell0-187d-account-create-update-lbfmn\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.801927 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b02d76-eca4-492a-b9c7-29a77627d816-operator-scripts\") pod \"nova-cell0-187d-account-create-update-lbfmn\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.802670 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b02d76-eca4-492a-b9c7-29a77627d816-operator-scripts\") pod \"nova-cell0-187d-account-create-update-lbfmn\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.823791 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gl7\" (UniqueName: 
\"kubernetes.io/projected/23b02d76-eca4-492a-b9c7-29a77627d816-kube-api-access-99gl7\") pod \"nova-cell0-187d-account-create-update-lbfmn\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.906184 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ec451f-d841-42fc-a6ab-3d81f62be3df-operator-scripts\") pod \"nova-cell1-73b3-account-create-update-sj8dw\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.906405 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jv4b\" (UniqueName: \"kubernetes.io/projected/28ec451f-d841-42fc-a6ab-3d81f62be3df-kube-api-access-6jv4b\") pod \"nova-cell1-73b3-account-create-update-sj8dw\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.906664 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:50 crc kubenswrapper[4842]: I0311 19:20:50.953447 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-db-create-nnvg2"] Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.009098 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jv4b\" (UniqueName: \"kubernetes.io/projected/28ec451f-d841-42fc-a6ab-3d81f62be3df-kube-api-access-6jv4b\") pod \"nova-cell1-73b3-account-create-update-sj8dw\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.009169 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ec451f-d841-42fc-a6ab-3d81f62be3df-operator-scripts\") pod \"nova-cell1-73b3-account-create-update-sj8dw\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.010134 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ec451f-d841-42fc-a6ab-3d81f62be3df-operator-scripts\") pod \"nova-cell1-73b3-account-create-update-sj8dw\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.027939 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jv4b\" (UniqueName: \"kubernetes.io/projected/28ec451f-d841-42fc-a6ab-3d81f62be3df-kube-api-access-6jv4b\") pod \"nova-cell1-73b3-account-create-update-sj8dw\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 
11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.102154 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.109696 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-b7dsg"] Mar 11 19:20:51 crc kubenswrapper[4842]: W0311 19:20:51.137286 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f67413c_94d7_4948_aec7_086827349cc6.slice/crio-f4e3af570901b108bdbeaf1ba388aa9babed8778608b8e4f765d8c0eed9fce95 WatchSource:0}: Error finding container f4e3af570901b108bdbeaf1ba388aa9babed8778608b8e4f765d8c0eed9fce95: Status 404 returned error can't find the container with id f4e3af570901b108bdbeaf1ba388aa9babed8778608b8e4f765d8c0eed9fce95 Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.229257 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-api-d894-account-create-update-m4xhn"] Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.235049 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-76zdk"] Mar 11 19:20:51 crc kubenswrapper[4842]: W0311 19:20:51.237841 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6134a356_636f_4379_9bd1_86db49454ca5.slice/crio-1338fd08bf9267c6fdf18d9fdc85eda22b9e99c7ddf9b191877566e4d93e36f2 WatchSource:0}: Error finding container 1338fd08bf9267c6fdf18d9fdc85eda22b9e99c7ddf9b191877566e4d93e36f2: Status 404 returned error can't find the container with id 1338fd08bf9267c6fdf18d9fdc85eda22b9e99c7ddf9b191877566e4d93e36f2 Mar 11 19:20:51 crc kubenswrapper[4842]: W0311 19:20:51.260569 4842 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0054e5b3_ea14_46c7_8742_8f9c9ff9a705.slice/crio-4895aeeb65c038ad0f8bf336d734691e6cedc945275ff29eb061a367ac261ac3 WatchSource:0}: Error finding container 4895aeeb65c038ad0f8bf336d734691e6cedc945275ff29eb061a367ac261ac3: Status 404 returned error can't find the container with id 4895aeeb65c038ad0f8bf336d734691e6cedc945275ff29eb061a367ac261ac3 Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.368515 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn"] Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.380140 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" event={"ID":"4f67413c-94d7-4948-aec7-086827349cc6","Type":"ContainerStarted","Data":"11074cf1586d8b964f3ac2b11c803178ea049b5f62ab260fb9b9a5f809f82e76"} Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.380178 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" event={"ID":"4f67413c-94d7-4948-aec7-086827349cc6","Type":"ContainerStarted","Data":"f4e3af570901b108bdbeaf1ba388aa9babed8778608b8e4f765d8c0eed9fce95"} Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.383800 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" event={"ID":"6134a356-636f-4379-9bd1-86db49454ca5","Type":"ContainerStarted","Data":"1338fd08bf9267c6fdf18d9fdc85eda22b9e99c7ddf9b191877566e4d93e36f2"} Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.385322 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-76zdk" event={"ID":"0054e5b3-ea14-46c7-8742-8f9c9ff9a705","Type":"ContainerStarted","Data":"4895aeeb65c038ad0f8bf336d734691e6cedc945275ff29eb061a367ac261ac3"} Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.387208 4842 generic.go:334] "Generic 
(PLEG): container finished" podID="1580830f-2870-436d-b982-a1775fc494bb" containerID="58fe376c8f9e7c1a2e40edc01459579c9d7b6dac4451e7e1b13657bc47b1870a" exitCode=0 Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.387237 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-nnvg2" event={"ID":"1580830f-2870-436d-b982-a1775fc494bb","Type":"ContainerDied","Data":"58fe376c8f9e7c1a2e40edc01459579c9d7b6dac4451e7e1b13657bc47b1870a"} Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.387254 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-nnvg2" event={"ID":"1580830f-2870-436d-b982-a1775fc494bb","Type":"ContainerStarted","Data":"a4bf4db9551d10e68019971df7522c63d439700ef60e34ba416286e39789e749"} Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.397700 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" podStartSLOduration=1.397686092 podStartE2EDuration="1.397686092s" podCreationTimestamp="2026-03-11 19:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:51.394929319 +0000 UTC m=+1897.042625609" watchObservedRunningTime="2026-03-11 19:20:51.397686092 +0000 UTC m=+1897.045382372" Mar 11 19:20:51 crc kubenswrapper[4842]: I0311 19:20:51.615836 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw"] Mar 11 19:20:51 crc kubenswrapper[4842]: W0311 19:20:51.653713 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28ec451f_d841_42fc_a6ab_3d81f62be3df.slice/crio-9f639d2177042683c2915ea07da30724621ce0416f382812e55c2e98353f2f78 WatchSource:0}: Error finding container 9f639d2177042683c2915ea07da30724621ce0416f382812e55c2e98353f2f78: Status 404 
returned error can't find the container with id 9f639d2177042683c2915ea07da30724621ce0416f382812e55c2e98353f2f78 Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.406639 4842 generic.go:334] "Generic (PLEG): container finished" podID="4f67413c-94d7-4948-aec7-086827349cc6" containerID="11074cf1586d8b964f3ac2b11c803178ea049b5f62ab260fb9b9a5f809f82e76" exitCode=0 Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.406720 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" event={"ID":"4f67413c-94d7-4948-aec7-086827349cc6","Type":"ContainerDied","Data":"11074cf1586d8b964f3ac2b11c803178ea049b5f62ab260fb9b9a5f809f82e76"} Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.410786 4842 generic.go:334] "Generic (PLEG): container finished" podID="28ec451f-d841-42fc-a6ab-3d81f62be3df" containerID="25fbed0be93ebd493774e753f783153d64120d27892582489799a861686f1961" exitCode=0 Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.411036 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" event={"ID":"28ec451f-d841-42fc-a6ab-3d81f62be3df","Type":"ContainerDied","Data":"25fbed0be93ebd493774e753f783153d64120d27892582489799a861686f1961"} Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.411327 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" event={"ID":"28ec451f-d841-42fc-a6ab-3d81f62be3df","Type":"ContainerStarted","Data":"9f639d2177042683c2915ea07da30724621ce0416f382812e55c2e98353f2f78"} Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.413223 4842 generic.go:334] "Generic (PLEG): container finished" podID="6134a356-636f-4379-9bd1-86db49454ca5" containerID="23cad88cbc670034dc3aca5f656c2fea9ef40add76b7ea8735ee4d3685011e2a" exitCode=0 Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.413372 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" event={"ID":"6134a356-636f-4379-9bd1-86db49454ca5","Type":"ContainerDied","Data":"23cad88cbc670034dc3aca5f656c2fea9ef40add76b7ea8735ee4d3685011e2a"} Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.417735 4842 generic.go:334] "Generic (PLEG): container finished" podID="0054e5b3-ea14-46c7-8742-8f9c9ff9a705" containerID="f499c78a599c8049e6297f3303f43ba95723052b5b6cb10e70972d546699c221" exitCode=0 Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.417892 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-76zdk" event={"ID":"0054e5b3-ea14-46c7-8742-8f9c9ff9a705","Type":"ContainerDied","Data":"f499c78a599c8049e6297f3303f43ba95723052b5b6cb10e70972d546699c221"} Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.420628 4842 generic.go:334] "Generic (PLEG): container finished" podID="23b02d76-eca4-492a-b9c7-29a77627d816" containerID="0f845aa1e8a92d4c660143a765bd14377a89070cdc38649a8c44d216bfbc017e" exitCode=0 Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.421013 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" event={"ID":"23b02d76-eca4-492a-b9c7-29a77627d816","Type":"ContainerDied","Data":"0f845aa1e8a92d4c660143a765bd14377a89070cdc38649a8c44d216bfbc017e"} Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.421068 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" event={"ID":"23b02d76-eca4-492a-b9c7-29a77627d816","Type":"ContainerStarted","Data":"a39b0f031b3a46ad2f6828ea5d42fa931db7b8d8f74c2b58eaaa714ac584e628"} Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.881729 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.939468 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxgsp\" (UniqueName: \"kubernetes.io/projected/1580830f-2870-436d-b982-a1775fc494bb-kube-api-access-lxgsp\") pod \"1580830f-2870-436d-b982-a1775fc494bb\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.939527 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1580830f-2870-436d-b982-a1775fc494bb-operator-scripts\") pod \"1580830f-2870-436d-b982-a1775fc494bb\" (UID: \"1580830f-2870-436d-b982-a1775fc494bb\") " Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.940394 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1580830f-2870-436d-b982-a1775fc494bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1580830f-2870-436d-b982-a1775fc494bb" (UID: "1580830f-2870-436d-b982-a1775fc494bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:52 crc kubenswrapper[4842]: I0311 19:20:52.945355 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1580830f-2870-436d-b982-a1775fc494bb-kube-api-access-lxgsp" (OuterVolumeSpecName: "kube-api-access-lxgsp") pod "1580830f-2870-436d-b982-a1775fc494bb" (UID: "1580830f-2870-436d-b982-a1775fc494bb"). InnerVolumeSpecName "kube-api-access-lxgsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.041137 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxgsp\" (UniqueName: \"kubernetes.io/projected/1580830f-2870-436d-b982-a1775fc494bb-kube-api-access-lxgsp\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.041180 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1580830f-2870-436d-b982-a1775fc494bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.432894 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-db-create-nnvg2" event={"ID":"1580830f-2870-436d-b982-a1775fc494bb","Type":"ContainerDied","Data":"a4bf4db9551d10e68019971df7522c63d439700ef60e34ba416286e39789e749"} Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.433068 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4bf4db9551d10e68019971df7522c63d439700ef60e34ba416286e39789e749" Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.433071 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-db-create-nnvg2" Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.806466 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.957798 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b02d76-eca4-492a-b9c7-29a77627d816-operator-scripts\") pod \"23b02d76-eca4-492a-b9c7-29a77627d816\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.958000 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99gl7\" (UniqueName: \"kubernetes.io/projected/23b02d76-eca4-492a-b9c7-29a77627d816-kube-api-access-99gl7\") pod \"23b02d76-eca4-492a-b9c7-29a77627d816\" (UID: \"23b02d76-eca4-492a-b9c7-29a77627d816\") " Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.958905 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b02d76-eca4-492a-b9c7-29a77627d816-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23b02d76-eca4-492a-b9c7-29a77627d816" (UID: "23b02d76-eca4-492a-b9c7-29a77627d816"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:53 crc kubenswrapper[4842]: I0311 19:20:53.964420 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b02d76-eca4-492a-b9c7-29a77627d816-kube-api-access-99gl7" (OuterVolumeSpecName: "kube-api-access-99gl7") pod "23b02d76-eca4-492a-b9c7-29a77627d816" (UID: "23b02d76-eca4-492a-b9c7-29a77627d816"). InnerVolumeSpecName "kube-api-access-99gl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.020420 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.026505 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.035052 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.040192 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.069177 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b02d76-eca4-492a-b9c7-29a77627d816-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.069219 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99gl7\" (UniqueName: \"kubernetes.io/projected/23b02d76-eca4-492a-b9c7-29a77627d816-kube-api-access-99gl7\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.170597 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6134a356-636f-4379-9bd1-86db49454ca5-operator-scripts\") pod \"6134a356-636f-4379-9bd1-86db49454ca5\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.170873 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wz9z\" (UniqueName: \"kubernetes.io/projected/6134a356-636f-4379-9bd1-86db49454ca5-kube-api-access-8wz9z\") pod \"6134a356-636f-4379-9bd1-86db49454ca5\" (UID: \"6134a356-636f-4379-9bd1-86db49454ca5\") " Mar 11 19:20:54 crc 
kubenswrapper[4842]: I0311 19:20:54.170951 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhqmb\" (UniqueName: \"kubernetes.io/projected/4f67413c-94d7-4948-aec7-086827349cc6-kube-api-access-zhqmb\") pod \"4f67413c-94d7-4948-aec7-086827349cc6\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171078 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ec451f-d841-42fc-a6ab-3d81f62be3df-operator-scripts\") pod \"28ec451f-d841-42fc-a6ab-3d81f62be3df\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171123 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6134a356-636f-4379-9bd1-86db49454ca5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6134a356-636f-4379-9bd1-86db49454ca5" (UID: "6134a356-636f-4379-9bd1-86db49454ca5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171240 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f67413c-94d7-4948-aec7-086827349cc6-operator-scripts\") pod \"4f67413c-94d7-4948-aec7-086827349cc6\" (UID: \"4f67413c-94d7-4948-aec7-086827349cc6\") " Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171355 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jv4b\" (UniqueName: \"kubernetes.io/projected/28ec451f-d841-42fc-a6ab-3d81f62be3df-kube-api-access-6jv4b\") pod \"28ec451f-d841-42fc-a6ab-3d81f62be3df\" (UID: \"28ec451f-d841-42fc-a6ab-3d81f62be3df\") " Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171457 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2rl8\" (UniqueName: \"kubernetes.io/projected/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-kube-api-access-f2rl8\") pod \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171522 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ec451f-d841-42fc-a6ab-3d81f62be3df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28ec451f-d841-42fc-a6ab-3d81f62be3df" (UID: "28ec451f-d841-42fc-a6ab-3d81f62be3df"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171532 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-operator-scripts\") pod \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\" (UID: \"0054e5b3-ea14-46c7-8742-8f9c9ff9a705\") " Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.171618 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f67413c-94d7-4948-aec7-086827349cc6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f67413c-94d7-4948-aec7-086827349cc6" (UID: "4f67413c-94d7-4948-aec7-086827349cc6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.172073 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0054e5b3-ea14-46c7-8742-8f9c9ff9a705" (UID: "0054e5b3-ea14-46c7-8742-8f9c9ff9a705"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.172188 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6134a356-636f-4379-9bd1-86db49454ca5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.172208 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ec451f-d841-42fc-a6ab-3d81f62be3df-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.172217 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f67413c-94d7-4948-aec7-086827349cc6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.174072 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f67413c-94d7-4948-aec7-086827349cc6-kube-api-access-zhqmb" (OuterVolumeSpecName: "kube-api-access-zhqmb") pod "4f67413c-94d7-4948-aec7-086827349cc6" (UID: "4f67413c-94d7-4948-aec7-086827349cc6"). InnerVolumeSpecName "kube-api-access-zhqmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.174649 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ec451f-d841-42fc-a6ab-3d81f62be3df-kube-api-access-6jv4b" (OuterVolumeSpecName: "kube-api-access-6jv4b") pod "28ec451f-d841-42fc-a6ab-3d81f62be3df" (UID: "28ec451f-d841-42fc-a6ab-3d81f62be3df"). InnerVolumeSpecName "kube-api-access-6jv4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.180366 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6134a356-636f-4379-9bd1-86db49454ca5-kube-api-access-8wz9z" (OuterVolumeSpecName: "kube-api-access-8wz9z") pod "6134a356-636f-4379-9bd1-86db49454ca5" (UID: "6134a356-636f-4379-9bd1-86db49454ca5"). InnerVolumeSpecName "kube-api-access-8wz9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.180774 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-kube-api-access-f2rl8" (OuterVolumeSpecName: "kube-api-access-f2rl8") pod "0054e5b3-ea14-46c7-8742-8f9c9ff9a705" (UID: "0054e5b3-ea14-46c7-8742-8f9c9ff9a705"). InnerVolumeSpecName "kube-api-access-f2rl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.273621 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wz9z\" (UniqueName: \"kubernetes.io/projected/6134a356-636f-4379-9bd1-86db49454ca5-kube-api-access-8wz9z\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.273652 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhqmb\" (UniqueName: \"kubernetes.io/projected/4f67413c-94d7-4948-aec7-086827349cc6-kube-api-access-zhqmb\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.273678 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jv4b\" (UniqueName: \"kubernetes.io/projected/28ec451f-d841-42fc-a6ab-3d81f62be3df-kube-api-access-6jv4b\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.273687 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2rl8\" (UniqueName: 
\"kubernetes.io/projected/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-kube-api-access-f2rl8\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.273698 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0054e5b3-ea14-46c7-8742-8f9c9ff9a705-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.446261 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" event={"ID":"6134a356-636f-4379-9bd1-86db49454ca5","Type":"ContainerDied","Data":"1338fd08bf9267c6fdf18d9fdc85eda22b9e99c7ddf9b191877566e4d93e36f2"} Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.446324 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1338fd08bf9267c6fdf18d9fdc85eda22b9e99c7ddf9b191877566e4d93e36f2" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.446393 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-api-d894-account-create-update-m4xhn" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.449120 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-db-create-76zdk" event={"ID":"0054e5b3-ea14-46c7-8742-8f9c9ff9a705","Type":"ContainerDied","Data":"4895aeeb65c038ad0f8bf336d734691e6cedc945275ff29eb061a367ac261ac3"} Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.449151 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4895aeeb65c038ad0f8bf336d734691e6cedc945275ff29eb061a367ac261ac3" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.449199 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell1-db-create-76zdk" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.457855 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.457904 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn" event={"ID":"23b02d76-eca4-492a-b9c7-29a77627d816","Type":"ContainerDied","Data":"a39b0f031b3a46ad2f6828ea5d42fa931db7b8d8f74c2b58eaaa714ac584e628"} Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.458040 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a39b0f031b3a46ad2f6828ea5d42fa931db7b8d8f74c2b58eaaa714ac584e628" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.462570 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" event={"ID":"4f67413c-94d7-4948-aec7-086827349cc6","Type":"ContainerDied","Data":"f4e3af570901b108bdbeaf1ba388aa9babed8778608b8e4f765d8c0eed9fce95"} Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.462606 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4e3af570901b108bdbeaf1ba388aa9babed8778608b8e4f765d8c0eed9fce95" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.462615 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-cell0-db-create-b7dsg" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.464822 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" event={"ID":"28ec451f-d841-42fc-a6ab-3d81f62be3df","Type":"ContainerDied","Data":"9f639d2177042683c2915ea07da30724621ce0416f382812e55c2e98353f2f78"} Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.464861 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f639d2177042683c2915ea07da30724621ce0416f382812e55c2e98353f2f78" Mar 11 19:20:54 crc kubenswrapper[4842]: I0311 19:20:54.465012 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw" Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.825717 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"] Mar 11 19:20:55 crc kubenswrapper[4842]: E0311 19:20:55.826816 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1580830f-2870-436d-b982-a1775fc494bb" containerName="mariadb-database-create" Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.826936 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="1580830f-2870-436d-b982-a1775fc494bb" containerName="mariadb-database-create" Mar 11 19:20:55 crc kubenswrapper[4842]: E0311 19:20:55.827042 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ec451f-d841-42fc-a6ab-3d81f62be3df" containerName="mariadb-account-create-update" Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.827108 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ec451f-d841-42fc-a6ab-3d81f62be3df" containerName="mariadb-account-create-update" Mar 11 19:20:55 crc kubenswrapper[4842]: E0311 19:20:55.827188 4842 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6134a356-636f-4379-9bd1-86db49454ca5" containerName="mariadb-account-create-update"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.827256 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6134a356-636f-4379-9bd1-86db49454ca5" containerName="mariadb-account-create-update"
Mar 11 19:20:55 crc kubenswrapper[4842]: E0311 19:20:55.827349 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0054e5b3-ea14-46c7-8742-8f9c9ff9a705" containerName="mariadb-database-create"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.827434 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="0054e5b3-ea14-46c7-8742-8f9c9ff9a705" containerName="mariadb-database-create"
Mar 11 19:20:55 crc kubenswrapper[4842]: E0311 19:20:55.827504 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b02d76-eca4-492a-b9c7-29a77627d816" containerName="mariadb-account-create-update"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.827570 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b02d76-eca4-492a-b9c7-29a77627d816" containerName="mariadb-account-create-update"
Mar 11 19:20:55 crc kubenswrapper[4842]: E0311 19:20:55.827642 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f67413c-94d7-4948-aec7-086827349cc6" containerName="mariadb-database-create"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.827706 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f67413c-94d7-4948-aec7-086827349cc6" containerName="mariadb-database-create"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.827960 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f67413c-94d7-4948-aec7-086827349cc6" containerName="mariadb-database-create"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.828056 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ec451f-d841-42fc-a6ab-3d81f62be3df" containerName="mariadb-account-create-update"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.828130 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="1580830f-2870-436d-b982-a1775fc494bb" containerName="mariadb-database-create"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.828198 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b02d76-eca4-492a-b9c7-29a77627d816" containerName="mariadb-account-create-update"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.828286 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6134a356-636f-4379-9bd1-86db49454ca5" containerName="mariadb-account-create-update"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.828357 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="0054e5b3-ea14-46c7-8742-8f9c9ff9a705" containerName="mariadb-database-create"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.829029 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.832624 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-nova-kuttl-dockercfg-pf4x2"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.832897 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-scripts"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.833095 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.835736 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"]
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.899817 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbchl\" (UniqueName: \"kubernetes.io/projected/e7588b4a-b06c-4e85-a2db-4750cb57d53f-kube-api-access-nbchl\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.900312 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:55 crc kubenswrapper[4842]: I0311 19:20:55.900520 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.001820 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.001871 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.001910 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbchl\" (UniqueName: \"kubernetes.io/projected/e7588b4a-b06c-4e85-a2db-4750cb57d53f-kube-api-access-nbchl\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.007770 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-scripts\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.008542 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-config-data\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.024845 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbchl\" (UniqueName: \"kubernetes.io/projected/e7588b4a-b06c-4e85-a2db-4750cb57d53f-kube-api-access-nbchl\") pod \"nova-kuttl-cell0-conductor-db-sync-kb8x4\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.046813 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"]
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.048391 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.053475 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-compute-fake1-compute-config-data"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.069325 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"]
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.070821 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.087634 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-scripts"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.087687 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.105314 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"]
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.161377 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.163481 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"]
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.208754 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5s7\" (UniqueName: \"kubernetes.io/projected/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-kube-api-access-km5s7\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.209117 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.209194 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjb6d\" (UniqueName: \"kubernetes.io/projected/ad288058-ff46-425c-a5e2-4313ed4e2688-kube-api-access-zjb6d\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.209239 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-config-data\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.209326 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.228485 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.233858 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.234798 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.246186 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-novncproxy-config-data"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.310927 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjb6d\" (UniqueName: \"kubernetes.io/projected/ad288058-ff46-425c-a5e2-4313ed4e2688-kube-api-access-zjb6d\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.310989 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-config-data\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.311021 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.311084 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.311123 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh9bz\" (UniqueName: \"kubernetes.io/projected/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-kube-api-access-rh9bz\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.311159 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km5s7\" (UniqueName: \"kubernetes.io/projected/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-kube-api-access-km5s7\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.311182 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.317333 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-scripts\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.317420 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-config-data\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.319234 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-config-data\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.328413 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjb6d\" (UniqueName: \"kubernetes.io/projected/ad288058-ff46-425c-a5e2-4313ed4e2688-kube-api-access-zjb6d\") pod \"nova-kuttl-cell1-conductor-db-sync-xzndf\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.330470 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km5s7\" (UniqueName: \"kubernetes.io/projected/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-kube-api-access-km5s7\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.412070 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.412595 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh9bz\" (UniqueName: \"kubernetes.io/projected/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-kube-api-access-rh9bz\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.418399 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-config-data\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.430164 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh9bz\" (UniqueName: \"kubernetes.io/projected/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-kube-api-access-rh9bz\") pod \"nova-kuttl-cell1-novncproxy-0\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.435950 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.469641 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.581263 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.651379 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"]
Mar 11 19:20:56 crc kubenswrapper[4842]: W0311 19:20:56.881834 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod457eeec2_b96e_4bb3_9087_3c73cb0c96c9.slice/crio-56816025f7b16028f0e1c483ab467d61d48cbaa0e7c53a280cbf168d7619a0f3 WatchSource:0}: Error finding container 56816025f7b16028f0e1c483ab467d61d48cbaa0e7c53a280cbf168d7619a0f3: Status 404 returned error can't find the container with id 56816025f7b16028f0e1c483ab467d61d48cbaa0e7c53a280cbf168d7619a0f3
Mar 11 19:20:56 crc kubenswrapper[4842]: I0311 19:20:56.887707 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"]
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.067057 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"]
Mar 11 19:20:57 crc kubenswrapper[4842]: W0311 19:20:57.070963 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad288058_ff46_425c_a5e2_4313ed4e2688.slice/crio-78196dec4f1d4876a94b2f82db523b8fed93a3b533f46dd6f15c75a102620fe7 WatchSource:0}: Error finding container 78196dec4f1d4876a94b2f82db523b8fed93a3b533f46dd6f15c75a102620fe7: Status 404 returned error can't find the container with id 78196dec4f1d4876a94b2f82db523b8fed93a3b533f46dd6f15c75a102620fe7
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.134839 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"]
Mar 11 19:20:57 crc kubenswrapper[4842]: W0311 19:20:57.135130 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa949ad8_639a_4fc1_b4ae_b021fd3bd425.slice/crio-9811db1191c75507fa9b357560876cba6ca5c97156eb4778b357c88f43d62d99 WatchSource:0}: Error finding container 9811db1191c75507fa9b357560876cba6ca5c97156eb4778b357c88f43d62d99: Status 404 returned error can't find the container with id 9811db1191c75507fa9b357560876cba6ca5c97156eb4778b357c88f43d62d99
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.514683 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4" event={"ID":"e7588b4a-b06c-4e85-a2db-4750cb57d53f","Type":"ContainerStarted","Data":"598cae17a2d9fe796c2e269d380af9410e3e9831be084520c46ac9e6fa531b60"}
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.515197 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4" event={"ID":"e7588b4a-b06c-4e85-a2db-4750cb57d53f","Type":"ContainerStarted","Data":"4991b07044382f28e839fcd64e1a539e0afabc8d6fb9a74d4e9617fd77158c1c"}
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.517823 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf" event={"ID":"ad288058-ff46-425c-a5e2-4313ed4e2688","Type":"ContainerStarted","Data":"51468a3d139bd24a2038d0fb68f83f9f112581740abaa04bf221b7e4da368a89"}
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.517868 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf" event={"ID":"ad288058-ff46-425c-a5e2-4313ed4e2688","Type":"ContainerStarted","Data":"78196dec4f1d4876a94b2f82db523b8fed93a3b533f46dd6f15c75a102620fe7"}
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.520589 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"457eeec2-b96e-4bb3-9087-3c73cb0c96c9","Type":"ContainerStarted","Data":"56816025f7b16028f0e1c483ab467d61d48cbaa0e7c53a280cbf168d7619a0f3"}
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.522733 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"fa949ad8-639a-4fc1-b4ae-b021fd3bd425","Type":"ContainerStarted","Data":"4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f"}
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.522778 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"fa949ad8-639a-4fc1-b4ae-b021fd3bd425","Type":"ContainerStarted","Data":"9811db1191c75507fa9b357560876cba6ca5c97156eb4778b357c88f43d62d99"}
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.547830 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4" podStartSLOduration=2.547801735 podStartE2EDuration="2.547801735s" podCreationTimestamp="2026-03-11 19:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:57.53931691 +0000 UTC m=+1903.187013270" watchObservedRunningTime="2026-03-11 19:20:57.547801735 +0000 UTC m=+1903.195498015"
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.575214 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf" podStartSLOduration=1.5751804090000001 podStartE2EDuration="1.575180409s" podCreationTimestamp="2026-03-11 19:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:57.568114892 +0000 UTC m=+1903.215811232" watchObservedRunningTime="2026-03-11 19:20:57.575180409 +0000 UTC m=+1903.222876689"
Mar 11 19:20:57 crc kubenswrapper[4842]: I0311 19:20:57.588071 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podStartSLOduration=1.58804686 podStartE2EDuration="1.58804686s" podCreationTimestamp="2026-03-11 19:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:20:57.586616092 +0000 UTC m=+1903.234312402" watchObservedRunningTime="2026-03-11 19:20:57.58804686 +0000 UTC m=+1903.235743170"
Mar 11 19:21:00 crc kubenswrapper[4842]: I0311 19:21:00.574666 4842 generic.go:334] "Generic (PLEG): container finished" podID="ad288058-ff46-425c-a5e2-4313ed4e2688" containerID="51468a3d139bd24a2038d0fb68f83f9f112581740abaa04bf221b7e4da368a89" exitCode=0
Mar 11 19:21:00 crc kubenswrapper[4842]: I0311 19:21:00.575263 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf" event={"ID":"ad288058-ff46-425c-a5e2-4313ed4e2688","Type":"ContainerDied","Data":"51468a3d139bd24a2038d0fb68f83f9f112581740abaa04bf221b7e4da368a89"}
Mar 11 19:21:01 crc kubenswrapper[4842]: I0311 19:21:01.582314 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:21:01 crc kubenswrapper[4842]: I0311 19:21:01.585402 4842 generic.go:334] "Generic (PLEG): container finished" podID="e7588b4a-b06c-4e85-a2db-4750cb57d53f" containerID="598cae17a2d9fe796c2e269d380af9410e3e9831be084520c46ac9e6fa531b60" exitCode=0
Mar 11 19:21:01 crc kubenswrapper[4842]: I0311 19:21:01.585436 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4" event={"ID":"e7588b4a-b06c-4e85-a2db-4750cb57d53f","Type":"ContainerDied","Data":"598cae17a2d9fe796c2e269d380af9410e3e9831be084520c46ac9e6fa531b60"}
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.212409 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.230715 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.296702 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjb6d\" (UniqueName: \"kubernetes.io/projected/ad288058-ff46-425c-a5e2-4313ed4e2688-kube-api-access-zjb6d\") pod \"ad288058-ff46-425c-a5e2-4313ed4e2688\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") "
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.297011 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbchl\" (UniqueName: \"kubernetes.io/projected/e7588b4a-b06c-4e85-a2db-4750cb57d53f-kube-api-access-nbchl\") pod \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") "
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.297146 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-config-data\") pod \"ad288058-ff46-425c-a5e2-4313ed4e2688\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") "
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.297173 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-scripts\") pod \"ad288058-ff46-425c-a5e2-4313ed4e2688\" (UID: \"ad288058-ff46-425c-a5e2-4313ed4e2688\") "
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.297196 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-config-data\") pod \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") "
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.297311 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-scripts\") pod \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\" (UID: \"e7588b4a-b06c-4e85-a2db-4750cb57d53f\") "
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.301167 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-scripts" (OuterVolumeSpecName: "scripts") pod "e7588b4a-b06c-4e85-a2db-4750cb57d53f" (UID: "e7588b4a-b06c-4e85-a2db-4750cb57d53f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.301876 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-scripts" (OuterVolumeSpecName: "scripts") pod "ad288058-ff46-425c-a5e2-4313ed4e2688" (UID: "ad288058-ff46-425c-a5e2-4313ed4e2688"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.302837 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7588b4a-b06c-4e85-a2db-4750cb57d53f-kube-api-access-nbchl" (OuterVolumeSpecName: "kube-api-access-nbchl") pod "e7588b4a-b06c-4e85-a2db-4750cb57d53f" (UID: "e7588b4a-b06c-4e85-a2db-4750cb57d53f"). InnerVolumeSpecName "kube-api-access-nbchl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.303369 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad288058-ff46-425c-a5e2-4313ed4e2688-kube-api-access-zjb6d" (OuterVolumeSpecName: "kube-api-access-zjb6d") pod "ad288058-ff46-425c-a5e2-4313ed4e2688" (UID: "ad288058-ff46-425c-a5e2-4313ed4e2688"). InnerVolumeSpecName "kube-api-access-zjb6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.317446 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-config-data" (OuterVolumeSpecName: "config-data") pod "e7588b4a-b06c-4e85-a2db-4750cb57d53f" (UID: "e7588b4a-b06c-4e85-a2db-4750cb57d53f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.326811 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-config-data" (OuterVolumeSpecName: "config-data") pod "ad288058-ff46-425c-a5e2-4313ed4e2688" (UID: "ad288058-ff46-425c-a5e2-4313ed4e2688"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.399230 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.399288 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.399302 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7588b4a-b06c-4e85-a2db-4750cb57d53f-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.399314 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjb6d\" (UniqueName: \"kubernetes.io/projected/ad288058-ff46-425c-a5e2-4313ed4e2688-kube-api-access-zjb6d\") on node \"crc\" DevicePath \"\""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.399324 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbchl\" (UniqueName: \"kubernetes.io/projected/e7588b4a-b06c-4e85-a2db-4750cb57d53f-kube-api-access-nbchl\") on node \"crc\" DevicePath \"\""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.399333 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad288058-ff46-425c-a5e2-4313ed4e2688-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.582210 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.595910 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.633380 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf" event={"ID":"ad288058-ff46-425c-a5e2-4313ed4e2688","Type":"ContainerDied","Data":"78196dec4f1d4876a94b2f82db523b8fed93a3b533f46dd6f15c75a102620fe7"}
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.633454 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78196dec4f1d4876a94b2f82db523b8fed93a3b533f46dd6f15c75a102620fe7"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.633407 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.637060 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"457eeec2-b96e-4bb3-9087-3c73cb0c96c9","Type":"ContainerStarted","Data":"ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54"}
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.637444 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.639322 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.639325 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4" event={"ID":"e7588b4a-b06c-4e85-a2db-4750cb57d53f","Type":"ContainerDied","Data":"4991b07044382f28e839fcd64e1a539e0afabc8d6fb9a74d4e9617fd77158c1c"}
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.639371 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4991b07044382f28e839fcd64e1a539e0afabc8d6fb9a74d4e9617fd77158c1c"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.657503 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.664162 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podStartSLOduration=1.277198592 podStartE2EDuration="10.664136142s" podCreationTimestamp="2026-03-11 19:20:56 +0000 UTC" firstStartedPulling="2026-03-11 19:20:56.884306362 +0000 UTC m=+1902.532002642" lastFinishedPulling="2026-03-11 19:21:06.271243912 +0000 UTC m=+1911.918940192" observedRunningTime="2026-03-11 19:21:06.661544973 +0000 UTC m=+1912.309241293" watchObservedRunningTime="2026-03-11 19:21:06.664136142 +0000 UTC m=+1912.311832462"
Mar 11 19:21:06 crc kubenswrapper[4842]: I0311 19:21:06.683074 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.378515 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:21:07 crc kubenswrapper[4842]: E0311 19:21:07.379388 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad288058-ff46-425c-a5e2-4313ed4e2688" containerName="nova-kuttl-cell1-conductor-db-sync"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.379413 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad288058-ff46-425c-a5e2-4313ed4e2688" containerName="nova-kuttl-cell1-conductor-db-sync"
Mar 11 19:21:07 crc kubenswrapper[4842]: E0311 19:21:07.379452 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7588b4a-b06c-4e85-a2db-4750cb57d53f" containerName="nova-kuttl-cell0-conductor-db-sync"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.379463 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7588b4a-b06c-4e85-a2db-4750cb57d53f" containerName="nova-kuttl-cell0-conductor-db-sync"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.379709 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad288058-ff46-425c-a5e2-4313ed4e2688" containerName="nova-kuttl-cell1-conductor-db-sync"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.379739 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7588b4a-b06c-4e85-a2db-4750cb57d53f" containerName="nova-kuttl-cell0-conductor-db-sync"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.380689 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.382759 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data"
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.390782 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"]
Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.392306 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.395509 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.402554 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.408864 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.517165 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxw4c\" (UniqueName: \"kubernetes.io/projected/c808fee0-be92-4eae-9774-9d89393aacb9-kube-api-access-sxw4c\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.517546 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c808fee0-be92-4eae-9774-9d89393aacb9-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.517596 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm6k9\" (UniqueName: \"kubernetes.io/projected/540ddacd-a44d-4c6b-b382-d43c70ca2470-kube-api-access-gm6k9\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.517692 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ddacd-a44d-4c6b-b382-d43c70ca2470-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.619514 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxw4c\" (UniqueName: \"kubernetes.io/projected/c808fee0-be92-4eae-9774-9d89393aacb9-kube-api-access-sxw4c\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.619708 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c808fee0-be92-4eae-9774-9d89393aacb9-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.619740 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm6k9\" (UniqueName: \"kubernetes.io/projected/540ddacd-a44d-4c6b-b382-d43c70ca2470-kube-api-access-gm6k9\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.619775 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ddacd-a44d-4c6b-b382-d43c70ca2470-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.630903 4842 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c808fee0-be92-4eae-9774-9d89393aacb9-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.643413 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxw4c\" (UniqueName: \"kubernetes.io/projected/c808fee0-be92-4eae-9774-9d89393aacb9-kube-api-access-sxw4c\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.651915 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ddacd-a44d-4c6b-b382-d43c70ca2470-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.653232 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm6k9\" (UniqueName: \"kubernetes.io/projected/540ddacd-a44d-4c6b-b382-d43c70ca2470-kube-api-access-gm6k9\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.695437 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:07 crc kubenswrapper[4842]: I0311 19:21:07.711041 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.160342 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:21:08 crc kubenswrapper[4842]: W0311 19:21:08.170426 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod540ddacd_a44d_4c6b_b382_d43c70ca2470.slice/crio-a2042246210792a9cb3424ca5e105d34d99238abf3d7cb31f3dde171c7199e91 WatchSource:0}: Error finding container a2042246210792a9cb3424ca5e105d34d99238abf3d7cb31f3dde171c7199e91: Status 404 returned error can't find the container with id a2042246210792a9cb3424ca5e105d34d99238abf3d7cb31f3dde171c7199e91 Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.226885 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:21:08 crc kubenswrapper[4842]: W0311 19:21:08.231833 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc808fee0_be92_4eae_9774_9d89393aacb9.slice/crio-6dcd4ad61a322b007b7b9ffd501f4683e371c73ff7bb155f540ab43bc784ba2b WatchSource:0}: Error finding container 6dcd4ad61a322b007b7b9ffd501f4683e371c73ff7bb155f540ab43bc784ba2b: Status 404 returned error can't find the container with id 6dcd4ad61a322b007b7b9ffd501f4683e371c73ff7bb155f540ab43bc784ba2b Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.658639 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"540ddacd-a44d-4c6b-b382-d43c70ca2470","Type":"ContainerStarted","Data":"9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c"} Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.658693 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.658706 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"540ddacd-a44d-4c6b-b382-d43c70ca2470","Type":"ContainerStarted","Data":"a2042246210792a9cb3424ca5e105d34d99238abf3d7cb31f3dde171c7199e91"} Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.664294 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"c808fee0-be92-4eae-9774-9d89393aacb9","Type":"ContainerStarted","Data":"36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62"} Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.664322 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"c808fee0-be92-4eae-9774-9d89393aacb9","Type":"ContainerStarted","Data":"6dcd4ad61a322b007b7b9ffd501f4683e371c73ff7bb155f540ab43bc784ba2b"} Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.664335 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:08 crc kubenswrapper[4842]: I0311 19:21:08.675641 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=1.6755889449999999 podStartE2EDuration="1.675588945s" podCreationTimestamp="2026-03-11 19:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:08.672228256 +0000 UTC m=+1914.319924536" watchObservedRunningTime="2026-03-11 19:21:08.675588945 +0000 UTC m=+1914.323285225" Mar 11 19:21:17 crc kubenswrapper[4842]: I0311 19:21:17.727830 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 
19:21:17 crc kubenswrapper[4842]: I0311 19:21:17.737360 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:21:17 crc kubenswrapper[4842]: I0311 19:21:17.745467 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podStartSLOduration=10.745424842 podStartE2EDuration="10.745424842s" podCreationTimestamp="2026-03-11 19:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:08.690957962 +0000 UTC m=+1914.338654242" watchObservedRunningTime="2026-03-11 19:21:17.745424842 +0000 UTC m=+1923.393121132" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.159303 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.161109 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.163640 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-config-data" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.163697 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-manage-scripts" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.175029 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.196069 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.197172 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.204717 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.297080 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-config-data\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.297150 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-scripts\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.297255 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6rg\" (UniqueName: \"kubernetes.io/projected/c689843a-7097-40e5-a6dc-45b0fff3f1f9-kube-api-access-tv6rg\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.297457 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-config-data\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc 
kubenswrapper[4842]: I0311 19:21:18.297507 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-scripts\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.297669 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md8bt\" (UniqueName: \"kubernetes.io/projected/b512fa33-0320-4516-b999-738699cd428b-kube-api-access-md8bt\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.399367 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md8bt\" (UniqueName: \"kubernetes.io/projected/b512fa33-0320-4516-b999-738699cd428b-kube-api-access-md8bt\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.399430 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-config-data\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.399482 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-scripts\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: 
\"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.399509 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6rg\" (UniqueName: \"kubernetes.io/projected/c689843a-7097-40e5-a6dc-45b0fff3f1f9-kube-api-access-tv6rg\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.399558 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-config-data\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.399580 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-scripts\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.405051 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-scripts\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.405168 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-config-data\") pod 
\"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.405595 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-scripts\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.406081 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-config-data\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.415187 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md8bt\" (UniqueName: \"kubernetes.io/projected/b512fa33-0320-4516-b999-738699cd428b-kube-api-access-md8bt\") pod \"nova-kuttl-cell1-host-discover-5bw4c\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.437024 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6rg\" (UniqueName: \"kubernetes.io/projected/c689843a-7097-40e5-a6dc-45b0fff3f1f9-kube-api-access-tv6rg\") pod \"nova-kuttl-cell1-cell-mapping-gv7hw\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.479653 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.495625 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.496718 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.503466 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-scripts" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.503728 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-manage-config-data" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.507741 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.517594 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.602100 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-config-data\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.602426 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jkz8\" (UniqueName: \"kubernetes.io/projected/406febbd-9625-4fa7-a281-0bb7c2a4fb19-kube-api-access-9jkz8\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.602479 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-scripts\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.677662 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.678963 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.681193 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.702338 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.704355 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-config-data\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.704410 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jkz8\" (UniqueName: \"kubernetes.io/projected/406febbd-9625-4fa7-a281-0bb7c2a4fb19-kube-api-access-9jkz8\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.704463 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-scripts\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.716099 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-scripts\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " 
pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.721951 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-config-data\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.731378 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.733117 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.742711 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.757707 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jkz8\" (UniqueName: \"kubernetes.io/projected/406febbd-9625-4fa7-a281-0bb7c2a4fb19-kube-api-access-9jkz8\") pod \"nova-kuttl-cell0-cell-mapping-d5zzc\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.764514 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.781053 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.782134 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.790790 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.802307 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.808234 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26955c4-4baf-4e5b-acb8-2716483cae3a-config-data\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.808303 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26955c4-4baf-4e5b-acb8-2716483cae3a-logs\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.808426 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfsgf\" (UniqueName: \"kubernetes.io/projected/b26955c4-4baf-4e5b-acb8-2716483cae3a-kube-api-access-pfsgf\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.910215 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:18 
crc kubenswrapper[4842]: I0311 19:21:18.910259 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e52316-b266-4eeb-8ffb-f410b7b278ef-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.910297 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5jp\" (UniqueName: \"kubernetes.io/projected/62e52316-b266-4eeb-8ffb-f410b7b278ef-kube-api-access-wp5jp\") pod \"nova-kuttl-scheduler-0\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.910357 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfsgf\" (UniqueName: \"kubernetes.io/projected/b26955c4-4baf-4e5b-acb8-2716483cae3a-kube-api-access-pfsgf\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.910423 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26955c4-4baf-4e5b-acb8-2716483cae3a-config-data\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.910445 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zwxl\" (UniqueName: \"kubernetes.io/projected/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-kube-api-access-6zwxl\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:18 crc 
kubenswrapper[4842]: I0311 19:21:18.910466 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.910488 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26955c4-4baf-4e5b-acb8-2716483cae3a-logs\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.910828 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26955c4-4baf-4e5b-acb8-2716483cae3a-logs\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.914931 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26955c4-4baf-4e5b-acb8-2716483cae3a-config-data\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.929743 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfsgf\" (UniqueName: \"kubernetes.io/projected/b26955c4-4baf-4e5b-acb8-2716483cae3a-kube-api-access-pfsgf\") pod \"nova-kuttl-api-0\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:18 crc kubenswrapper[4842]: I0311 19:21:18.941992 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.011791 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zwxl\" (UniqueName: \"kubernetes.io/projected/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-kube-api-access-6zwxl\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.011863 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.011899 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.011918 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e52316-b266-4eeb-8ffb-f410b7b278ef-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.011943 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5jp\" (UniqueName: \"kubernetes.io/projected/62e52316-b266-4eeb-8ffb-f410b7b278ef-kube-api-access-wp5jp\") pod \"nova-kuttl-scheduler-0\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" 
Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.012911 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.013775 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw"] Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.015550 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.015859 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e52316-b266-4eeb-8ffb-f410b7b278ef-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.035789 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zwxl\" (UniqueName: \"kubernetes.io/projected/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-kube-api-access-6zwxl\") pod \"nova-kuttl-metadata-0\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.035855 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.038877 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5jp\" (UniqueName: \"kubernetes.io/projected/62e52316-b266-4eeb-8ffb-f410b7b278ef-kube-api-access-wp5jp\") pod \"nova-kuttl-scheduler-0\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.068863 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.105912 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c"] Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.108791 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.427657 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc"] Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.539609 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:19 crc kubenswrapper[4842]: W0311 19:21:19.554652 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26955c4_4baf_4e5b_acb8_2716483cae3a.slice/crio-ba0821f6eaee4ac50945ab9575b38039a3b17d672f24f4b0279424f648d17206 WatchSource:0}: Error finding container ba0821f6eaee4ac50945ab9575b38039a3b17d672f24f4b0279424f648d17206: Status 404 returned error can't find the container with id ba0821f6eaee4ac50945ab9575b38039a3b17d672f24f4b0279424f648d17206 Mar 11 19:21:19 crc kubenswrapper[4842]: W0311 19:21:19.664425 4842 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62e52316_b266_4eeb_8ffb_f410b7b278ef.slice/crio-8daa6a9475808af5c2237934c09c06386bd18f17a8fa8f731571fddeb63ef926 WatchSource:0}: Error finding container 8daa6a9475808af5c2237934c09c06386bd18f17a8fa8f731571fddeb63ef926: Status 404 returned error can't find the container with id 8daa6a9475808af5c2237934c09c06386bd18f17a8fa8f731571fddeb63ef926 Mar 11 19:21:19 crc kubenswrapper[4842]: W0311 19:21:19.669751 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d89a6a3_6c92_44a6_8593_aef4bf6ab733.slice/crio-bedff409c0a9a0b29acb2fd513671864f4c1096cadd84a52fd910fdea21454f9 WatchSource:0}: Error finding container bedff409c0a9a0b29acb2fd513671864f4c1096cadd84a52fd910fdea21454f9: Status 404 returned error can't find the container with id bedff409c0a9a0b29acb2fd513671864f4c1096cadd84a52fd910fdea21454f9 Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.672774 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.679677 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.788470 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4d89a6a3-6c92-44a6-8593-aef4bf6ab733","Type":"ContainerStarted","Data":"bedff409c0a9a0b29acb2fd513671864f4c1096cadd84a52fd910fdea21454f9"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.791106 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"62e52316-b266-4eeb-8ffb-f410b7b278ef","Type":"ContainerStarted","Data":"8daa6a9475808af5c2237934c09c06386bd18f17a8fa8f731571fddeb63ef926"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 
19:21:19.792627 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" event={"ID":"406febbd-9625-4fa7-a281-0bb7c2a4fb19","Type":"ContainerStarted","Data":"a6022800a9ccab7df547181656fff0ff01039b471cafe4b9e7acea3e8c90c077"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.792706 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" event={"ID":"406febbd-9625-4fa7-a281-0bb7c2a4fb19","Type":"ContainerStarted","Data":"bfea8917643db82a167014fb82046eeb2b7c47b7a7e08df8122a4254244c342f"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.794481 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" event={"ID":"b512fa33-0320-4516-b999-738699cd428b","Type":"ContainerStarted","Data":"516613b031a24f35699505c859760ef198a811ebbfa154f57aabb70ff4e50832"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.794747 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" event={"ID":"b512fa33-0320-4516-b999-738699cd428b","Type":"ContainerStarted","Data":"c089177aed67b61ec7ada85453cadb2441f6cb530e77da8d5f4e1186de3073d7"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.800252 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" event={"ID":"c689843a-7097-40e5-a6dc-45b0fff3f1f9","Type":"ContainerStarted","Data":"3f3fe37d88ff32c4a0913783352ab36605afb090e771605be9e8df6ecb0f7be5"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.800324 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" event={"ID":"c689843a-7097-40e5-a6dc-45b0fff3f1f9","Type":"ContainerStarted","Data":"53c4f2c68f3b46871b752e241f23f50909032274e9eaf52ac61aca4a6f4cd986"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 
19:21:19.805984 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b26955c4-4baf-4e5b-acb8-2716483cae3a","Type":"ContainerStarted","Data":"44b9914679c451c16cd6a94147cc2c93a12d381af183b9cd2a03610828a14c22"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.806031 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b26955c4-4baf-4e5b-acb8-2716483cae3a","Type":"ContainerStarted","Data":"ba0821f6eaee4ac50945ab9575b38039a3b17d672f24f4b0279424f648d17206"} Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.814049 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" podStartSLOduration=1.814027838 podStartE2EDuration="1.814027838s" podCreationTimestamp="2026-03-11 19:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:19.806476068 +0000 UTC m=+1925.454172348" watchObservedRunningTime="2026-03-11 19:21:19.814027838 +0000 UTC m=+1925.461724118" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.827011 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" podStartSLOduration=1.826988601 podStartE2EDuration="1.826988601s" podCreationTimestamp="2026-03-11 19:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:19.818881337 +0000 UTC m=+1925.466577617" watchObservedRunningTime="2026-03-11 19:21:19.826988601 +0000 UTC m=+1925.474684881" Mar 11 19:21:19 crc kubenswrapper[4842]: I0311 19:21:19.849414 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" podStartSLOduration=1.8493970640000001 
podStartE2EDuration="1.849397064s" podCreationTimestamp="2026-03-11 19:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:19.844781992 +0000 UTC m=+1925.492478292" watchObservedRunningTime="2026-03-11 19:21:19.849397064 +0000 UTC m=+1925.497093344" Mar 11 19:21:20 crc kubenswrapper[4842]: I0311 19:21:20.820693 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b26955c4-4baf-4e5b-acb8-2716483cae3a","Type":"ContainerStarted","Data":"9397f16a6088147191cddd546a6d58b722d370522b2f40414279e78f27e264ab"} Mar 11 19:21:20 crc kubenswrapper[4842]: I0311 19:21:20.824726 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4d89a6a3-6c92-44a6-8593-aef4bf6ab733","Type":"ContainerStarted","Data":"3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411"} Mar 11 19:21:20 crc kubenswrapper[4842]: I0311 19:21:20.824773 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4d89a6a3-6c92-44a6-8593-aef4bf6ab733","Type":"ContainerStarted","Data":"f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd"} Mar 11 19:21:20 crc kubenswrapper[4842]: I0311 19:21:20.828032 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"62e52316-b266-4eeb-8ffb-f410b7b278ef","Type":"ContainerStarted","Data":"0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4"} Mar 11 19:21:20 crc kubenswrapper[4842]: I0311 19:21:20.856516 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.856491631 podStartE2EDuration="2.856491631s" podCreationTimestamp="2026-03-11 19:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:20.844858203 +0000 UTC m=+1926.492554483" watchObservedRunningTime="2026-03-11 19:21:20.856491631 +0000 UTC m=+1926.504187911" Mar 11 19:21:20 crc kubenswrapper[4842]: I0311 19:21:20.876935 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.8768589799999997 podStartE2EDuration="2.87685898s" podCreationTimestamp="2026-03-11 19:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:20.871629592 +0000 UTC m=+1926.519325882" watchObservedRunningTime="2026-03-11 19:21:20.87685898 +0000 UTC m=+1926.524555260" Mar 11 19:21:20 crc kubenswrapper[4842]: I0311 19:21:20.891721 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.891695423 podStartE2EDuration="2.891695423s" podCreationTimestamp="2026-03-11 19:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:20.889030993 +0000 UTC m=+1926.536727273" watchObservedRunningTime="2026-03-11 19:21:20.891695423 +0000 UTC m=+1926.539391713" Mar 11 19:21:22 crc kubenswrapper[4842]: I0311 19:21:22.849802 4842 generic.go:334] "Generic (PLEG): container finished" podID="b512fa33-0320-4516-b999-738699cd428b" containerID="516613b031a24f35699505c859760ef198a811ebbfa154f57aabb70ff4e50832" exitCode=255 Mar 11 19:21:22 crc kubenswrapper[4842]: I0311 19:21:22.849870 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" event={"ID":"b512fa33-0320-4516-b999-738699cd428b","Type":"ContainerDied","Data":"516613b031a24f35699505c859760ef198a811ebbfa154f57aabb70ff4e50832"} Mar 11 19:21:22 crc kubenswrapper[4842]: 
I0311 19:21:22.850875 4842 scope.go:117] "RemoveContainer" containerID="516613b031a24f35699505c859760ef198a811ebbfa154f57aabb70ff4e50832" Mar 11 19:21:23 crc kubenswrapper[4842]: I0311 19:21:23.862690 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" event={"ID":"b512fa33-0320-4516-b999-738699cd428b","Type":"ContainerStarted","Data":"3bc7e8588559d0df6449aafc7650dcaeeebc33d95898d976b0d7424329f9aa89"} Mar 11 19:21:24 crc kubenswrapper[4842]: I0311 19:21:24.109614 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:24 crc kubenswrapper[4842]: I0311 19:21:24.874989 4842 generic.go:334] "Generic (PLEG): container finished" podID="406febbd-9625-4fa7-a281-0bb7c2a4fb19" containerID="a6022800a9ccab7df547181656fff0ff01039b471cafe4b9e7acea3e8c90c077" exitCode=0 Mar 11 19:21:24 crc kubenswrapper[4842]: I0311 19:21:24.875115 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" event={"ID":"406febbd-9625-4fa7-a281-0bb7c2a4fb19","Type":"ContainerDied","Data":"a6022800a9ccab7df547181656fff0ff01039b471cafe4b9e7acea3e8c90c077"} Mar 11 19:21:24 crc kubenswrapper[4842]: I0311 19:21:24.883260 4842 generic.go:334] "Generic (PLEG): container finished" podID="c689843a-7097-40e5-a6dc-45b0fff3f1f9" containerID="3f3fe37d88ff32c4a0913783352ab36605afb090e771605be9e8df6ecb0f7be5" exitCode=0 Mar 11 19:21:24 crc kubenswrapper[4842]: I0311 19:21:24.883708 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" event={"ID":"c689843a-7097-40e5-a6dc-45b0fff3f1f9","Type":"ContainerDied","Data":"3f3fe37d88ff32c4a0913783352ab36605afb090e771605be9e8df6ecb0f7be5"} Mar 11 19:21:25 crc kubenswrapper[4842]: I0311 19:21:25.894850 4842 generic.go:334] "Generic (PLEG): container finished" podID="b512fa33-0320-4516-b999-738699cd428b" 
containerID="3bc7e8588559d0df6449aafc7650dcaeeebc33d95898d976b0d7424329f9aa89" exitCode=0 Mar 11 19:21:25 crc kubenswrapper[4842]: I0311 19:21:25.894890 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" event={"ID":"b512fa33-0320-4516-b999-738699cd428b","Type":"ContainerDied","Data":"3bc7e8588559d0df6449aafc7650dcaeeebc33d95898d976b0d7424329f9aa89"} Mar 11 19:21:25 crc kubenswrapper[4842]: I0311 19:21:25.895298 4842 scope.go:117] "RemoveContainer" containerID="516613b031a24f35699505c859760ef198a811ebbfa154f57aabb70ff4e50832" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.327526 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.334738 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.455937 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-config-data\") pod \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.456941 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-config-data\") pod \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.457023 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-scripts\") pod \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\" (UID: 
\"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.457058 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv6rg\" (UniqueName: \"kubernetes.io/projected/c689843a-7097-40e5-a6dc-45b0fff3f1f9-kube-api-access-tv6rg\") pod \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.457085 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jkz8\" (UniqueName: \"kubernetes.io/projected/406febbd-9625-4fa7-a281-0bb7c2a4fb19-kube-api-access-9jkz8\") pod \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\" (UID: \"406febbd-9625-4fa7-a281-0bb7c2a4fb19\") " Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.457110 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-scripts\") pod \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\" (UID: \"c689843a-7097-40e5-a6dc-45b0fff3f1f9\") " Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.461930 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406febbd-9625-4fa7-a281-0bb7c2a4fb19-kube-api-access-9jkz8" (OuterVolumeSpecName: "kube-api-access-9jkz8") pod "406febbd-9625-4fa7-a281-0bb7c2a4fb19" (UID: "406febbd-9625-4fa7-a281-0bb7c2a4fb19"). InnerVolumeSpecName "kube-api-access-9jkz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.462049 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c689843a-7097-40e5-a6dc-45b0fff3f1f9-kube-api-access-tv6rg" (OuterVolumeSpecName: "kube-api-access-tv6rg") pod "c689843a-7097-40e5-a6dc-45b0fff3f1f9" (UID: "c689843a-7097-40e5-a6dc-45b0fff3f1f9"). InnerVolumeSpecName "kube-api-access-tv6rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.462190 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-scripts" (OuterVolumeSpecName: "scripts") pod "406febbd-9625-4fa7-a281-0bb7c2a4fb19" (UID: "406febbd-9625-4fa7-a281-0bb7c2a4fb19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.462560 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-scripts" (OuterVolumeSpecName: "scripts") pod "c689843a-7097-40e5-a6dc-45b0fff3f1f9" (UID: "c689843a-7097-40e5-a6dc-45b0fff3f1f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.479939 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-config-data" (OuterVolumeSpecName: "config-data") pod "406febbd-9625-4fa7-a281-0bb7c2a4fb19" (UID: "406febbd-9625-4fa7-a281-0bb7c2a4fb19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.484187 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-config-data" (OuterVolumeSpecName: "config-data") pod "c689843a-7097-40e5-a6dc-45b0fff3f1f9" (UID: "c689843a-7097-40e5-a6dc-45b0fff3f1f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.559669 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.559713 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.559729 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406febbd-9625-4fa7-a281-0bb7c2a4fb19-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.559740 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv6rg\" (UniqueName: \"kubernetes.io/projected/c689843a-7097-40e5-a6dc-45b0fff3f1f9-kube-api-access-tv6rg\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.559754 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jkz8\" (UniqueName: \"kubernetes.io/projected/406febbd-9625-4fa7-a281-0bb7c2a4fb19-kube-api-access-9jkz8\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.559765 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c689843a-7097-40e5-a6dc-45b0fff3f1f9-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.906560 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" event={"ID":"406febbd-9625-4fa7-a281-0bb7c2a4fb19","Type":"ContainerDied","Data":"bfea8917643db82a167014fb82046eeb2b7c47b7a7e08df8122a4254244c342f"} Mar 11 19:21:26 crc 
kubenswrapper[4842]: I0311 19:21:26.907305 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfea8917643db82a167014fb82046eeb2b7c47b7a7e08df8122a4254244c342f" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.906803 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.915228 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" event={"ID":"c689843a-7097-40e5-a6dc-45b0fff3f1f9","Type":"ContainerDied","Data":"53c4f2c68f3b46871b752e241f23f50909032274e9eaf52ac61aca4a6f4cd986"} Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.915486 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53c4f2c68f3b46871b752e241f23f50909032274e9eaf52ac61aca4a6f4cd986" Mar 11 19:21:26 crc kubenswrapper[4842]: I0311 19:21:26.915304 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.107117 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.107893 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-log" containerID="cri-o://44b9914679c451c16cd6a94147cc2c93a12d381af183b9cd2a03610828a14c22" gracePeriod=30 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.108359 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-api" containerID="cri-o://9397f16a6088147191cddd546a6d58b722d370522b2f40414279e78f27e264ab" gracePeriod=30 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.147217 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.147453 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="62e52316-b266-4eeb-8ffb-f410b7b278ef" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4" gracePeriod=30 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.253598 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.253880 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-log" 
containerID="cri-o://f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd" gracePeriod=30 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.253965 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411" gracePeriod=30 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.290508 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.477408 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-config-data\") pod \"b512fa33-0320-4516-b999-738699cd428b\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.477522 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-scripts\") pod \"b512fa33-0320-4516-b999-738699cd428b\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.477695 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md8bt\" (UniqueName: \"kubernetes.io/projected/b512fa33-0320-4516-b999-738699cd428b-kube-api-access-md8bt\") pod \"b512fa33-0320-4516-b999-738699cd428b\" (UID: \"b512fa33-0320-4516-b999-738699cd428b\") " Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.481421 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b512fa33-0320-4516-b999-738699cd428b-kube-api-access-md8bt" (OuterVolumeSpecName: 
"kube-api-access-md8bt") pod "b512fa33-0320-4516-b999-738699cd428b" (UID: "b512fa33-0320-4516-b999-738699cd428b"). InnerVolumeSpecName "kube-api-access-md8bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.482431 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-scripts" (OuterVolumeSpecName: "scripts") pod "b512fa33-0320-4516-b999-738699cd428b" (UID: "b512fa33-0320-4516-b999-738699cd428b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.497900 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-config-data" (OuterVolumeSpecName: "config-data") pod "b512fa33-0320-4516-b999-738699cd428b" (UID: "b512fa33-0320-4516-b999-738699cd428b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.579595 4842 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.579636 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md8bt\" (UniqueName: \"kubernetes.io/projected/b512fa33-0320-4516-b999-738699cd428b-kube-api-access-md8bt\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.579649 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b512fa33-0320-4516-b999-738699cd428b-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.928340 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" event={"ID":"b512fa33-0320-4516-b999-738699cd428b","Type":"ContainerDied","Data":"c089177aed67b61ec7ada85453cadb2441f6cb530e77da8d5f4e1186de3073d7"} Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.928411 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.928434 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c089177aed67b61ec7ada85453cadb2441f6cb530e77da8d5f4e1186de3073d7" Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.931064 4842 generic.go:334] "Generic (PLEG): container finished" podID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerID="9397f16a6088147191cddd546a6d58b722d370522b2f40414279e78f27e264ab" exitCode=0 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.931222 4842 generic.go:334] "Generic (PLEG): container finished" podID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerID="44b9914679c451c16cd6a94147cc2c93a12d381af183b9cd2a03610828a14c22" exitCode=143 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.931504 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b26955c4-4baf-4e5b-acb8-2716483cae3a","Type":"ContainerDied","Data":"9397f16a6088147191cddd546a6d58b722d370522b2f40414279e78f27e264ab"} Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.931658 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b26955c4-4baf-4e5b-acb8-2716483cae3a","Type":"ContainerDied","Data":"44b9914679c451c16cd6a94147cc2c93a12d381af183b9cd2a03610828a14c22"} Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.933920 4842 generic.go:334] "Generic (PLEG): container finished" podID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerID="f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd" exitCode=143 Mar 11 19:21:27 crc kubenswrapper[4842]: I0311 19:21:27.933980 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" 
event={"ID":"4d89a6a3-6c92-44a6-8593-aef4bf6ab733","Type":"ContainerDied","Data":"f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd"} Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.430031 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.603323 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfsgf\" (UniqueName: \"kubernetes.io/projected/b26955c4-4baf-4e5b-acb8-2716483cae3a-kube-api-access-pfsgf\") pod \"b26955c4-4baf-4e5b-acb8-2716483cae3a\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.603449 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26955c4-4baf-4e5b-acb8-2716483cae3a-logs\") pod \"b26955c4-4baf-4e5b-acb8-2716483cae3a\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.603550 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26955c4-4baf-4e5b-acb8-2716483cae3a-config-data\") pod \"b26955c4-4baf-4e5b-acb8-2716483cae3a\" (UID: \"b26955c4-4baf-4e5b-acb8-2716483cae3a\") " Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.603955 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b26955c4-4baf-4e5b-acb8-2716483cae3a-logs" (OuterVolumeSpecName: "logs") pod "b26955c4-4baf-4e5b-acb8-2716483cae3a" (UID: "b26955c4-4baf-4e5b-acb8-2716483cae3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.604092 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26955c4-4baf-4e5b-acb8-2716483cae3a-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.608005 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26955c4-4baf-4e5b-acb8-2716483cae3a-kube-api-access-pfsgf" (OuterVolumeSpecName: "kube-api-access-pfsgf") pod "b26955c4-4baf-4e5b-acb8-2716483cae3a" (UID: "b26955c4-4baf-4e5b-acb8-2716483cae3a"). InnerVolumeSpecName "kube-api-access-pfsgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.631611 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26955c4-4baf-4e5b-acb8-2716483cae3a-config-data" (OuterVolumeSpecName: "config-data") pod "b26955c4-4baf-4e5b-acb8-2716483cae3a" (UID: "b26955c4-4baf-4e5b-acb8-2716483cae3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.705348 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26955c4-4baf-4e5b-acb8-2716483cae3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.705393 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfsgf\" (UniqueName: \"kubernetes.io/projected/b26955c4-4baf-4e5b-acb8-2716483cae3a-kube-api-access-pfsgf\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.842882 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.950423 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.950422 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"b26955c4-4baf-4e5b-acb8-2716483cae3a","Type":"ContainerDied","Data":"ba0821f6eaee4ac50945ab9575b38039a3b17d672f24f4b0279424f648d17206"} Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.950627 4842 scope.go:117] "RemoveContainer" containerID="9397f16a6088147191cddd546a6d58b722d370522b2f40414279e78f27e264ab" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.954423 4842 generic.go:334] "Generic (PLEG): container finished" podID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerID="3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411" exitCode=0 Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.954486 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.954484 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4d89a6a3-6c92-44a6-8593-aef4bf6ab733","Type":"ContainerDied","Data":"3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411"} Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.954732 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"4d89a6a3-6c92-44a6-8593-aef4bf6ab733","Type":"ContainerDied","Data":"bedff409c0a9a0b29acb2fd513671864f4c1096cadd84a52fd910fdea21454f9"} Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.973133 4842 scope.go:117] "RemoveContainer" containerID="44b9914679c451c16cd6a94147cc2c93a12d381af183b9cd2a03610828a14c22" Mar 11 19:21:28 crc kubenswrapper[4842]: I0311 19:21:28.990057 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.000485 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.007524 4842 scope.go:117] "RemoveContainer" containerID="3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.012731 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-config-data\") pod \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.012897 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-logs\") pod 
\"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.013049 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zwxl\" (UniqueName: \"kubernetes.io/projected/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-kube-api-access-6zwxl\") pod \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\" (UID: \"4d89a6a3-6c92-44a6-8593-aef4bf6ab733\") " Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.014249 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-logs" (OuterVolumeSpecName: "logs") pod "4d89a6a3-6c92-44a6-8593-aef4bf6ab733" (UID: "4d89a6a3-6c92-44a6-8593-aef4bf6ab733"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.018533 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-kube-api-access-6zwxl" (OuterVolumeSpecName: "kube-api-access-6zwxl") pod "4d89a6a3-6c92-44a6-8593-aef4bf6ab733" (UID: "4d89a6a3-6c92-44a6-8593-aef4bf6ab733"). InnerVolumeSpecName "kube-api-access-6zwxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029186 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.029718 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b512fa33-0320-4516-b999-738699cd428b" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029737 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b512fa33-0320-4516-b999-738699cd428b" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.029760 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c689843a-7097-40e5-a6dc-45b0fff3f1f9" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029770 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c689843a-7097-40e5-a6dc-45b0fff3f1f9" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.029786 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-log" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029794 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-log" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.029825 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-log" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029834 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-log" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.029851 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406febbd-9625-4fa7-a281-0bb7c2a4fb19" 
containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029859 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="406febbd-9625-4fa7-a281-0bb7c2a4fb19" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.029880 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-metadata" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029888 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-metadata" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.029911 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-api" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.029922 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-api" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030103 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-api" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030123 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-log" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030143 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" containerName="nova-kuttl-api-log" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030158 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="406febbd-9625-4fa7-a281-0bb7c2a4fb19" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030170 4842 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b512fa33-0320-4516-b999-738699cd428b" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030181 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c689843a-7097-40e5-a6dc-45b0fff3f1f9" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030191 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" containerName="nova-kuttl-metadata-metadata" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030207 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b512fa33-0320-4516-b999-738699cd428b" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.030441 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b512fa33-0320-4516-b999-738699cd428b" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.030455 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b512fa33-0320-4516-b999-738699cd428b" containerName="nova-manage" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.031575 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.034720 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.035138 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.049133 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-config-data" (OuterVolumeSpecName: "config-data") pod "4d89a6a3-6c92-44a6-8593-aef4bf6ab733" (UID: "4d89a6a3-6c92-44a6-8593-aef4bf6ab733"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.112232 4842 scope.go:117] "RemoveContainer" containerID="f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.115687 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zwxl\" (UniqueName: \"kubernetes.io/projected/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-kube-api-access-6zwxl\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.115715 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.115726 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d89a6a3-6c92-44a6-8593-aef4bf6ab733-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.134039 4842 scope.go:117] "RemoveContainer" containerID="3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.134773 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411\": container with ID starting with 3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411 not found: ID does not exist" containerID="3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.134857 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411"} err="failed to get container status 
\"3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411\": rpc error: code = NotFound desc = could not find container \"3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411\": container with ID starting with 3b09033809d0c9383c4133b7de168de4a8e327594109775f9b81643f2879e411 not found: ID does not exist" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.134919 4842 scope.go:117] "RemoveContainer" containerID="f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd" Mar 11 19:21:29 crc kubenswrapper[4842]: E0311 19:21:29.135356 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd\": container with ID starting with f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd not found: ID does not exist" containerID="f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.135387 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd"} err="failed to get container status \"f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd\": rpc error: code = NotFound desc = could not find container \"f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd\": container with ID starting with f662543c622f48560d176346c1069295150a15a82364f34dd1299fd9eb9d02dd not found: ID does not exist" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.216875 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c193b9-90ca-4bba-9958-65c01e8a64e0-config-data\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 
19:21:29.216957 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzs49\" (UniqueName: \"kubernetes.io/projected/71c193b9-90ca-4bba-9958-65c01e8a64e0-kube-api-access-vzs49\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.217209 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c193b9-90ca-4bba-9958-65c01e8a64e0-logs\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.298069 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.307670 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.319130 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c193b9-90ca-4bba-9958-65c01e8a64e0-logs\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.319342 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c193b9-90ca-4bba-9958-65c01e8a64e0-config-data\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.319375 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzs49\" (UniqueName: 
\"kubernetes.io/projected/71c193b9-90ca-4bba-9958-65c01e8a64e0-kube-api-access-vzs49\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.319664 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c193b9-90ca-4bba-9958-65c01e8a64e0-logs\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.323046 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.325418 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c193b9-90ca-4bba-9958-65c01e8a64e0-config-data\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.326992 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.345443 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.351442 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-metadata-config-data" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.376847 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzs49\" (UniqueName: \"kubernetes.io/projected/71c193b9-90ca-4bba-9958-65c01e8a64e0-kube-api-access-vzs49\") pod \"nova-kuttl-api-0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") " pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.407728 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.421512 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d786c50e-6bfb-4b96-bec9-cfa618ab848a-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.421636 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d786c50e-6bfb-4b96-bec9-cfa618ab848a-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.421721 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv62b\" (UniqueName: 
\"kubernetes.io/projected/d786c50e-6bfb-4b96-bec9-cfa618ab848a-kube-api-access-zv62b\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.524192 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d786c50e-6bfb-4b96-bec9-cfa618ab848a-config-data\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.524284 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d786c50e-6bfb-4b96-bec9-cfa618ab848a-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.524328 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv62b\" (UniqueName: \"kubernetes.io/projected/d786c50e-6bfb-4b96-bec9-cfa618ab848a-kube-api-access-zv62b\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.525426 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d786c50e-6bfb-4b96-bec9-cfa618ab848a-logs\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.528457 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d786c50e-6bfb-4b96-bec9-cfa618ab848a-config-data\") pod \"nova-kuttl-metadata-0\" (UID: 
\"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.543336 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv62b\" (UniqueName: \"kubernetes.io/projected/d786c50e-6bfb-4b96-bec9-cfa618ab848a-kube-api-access-zv62b\") pod \"nova-kuttl-metadata-0\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.731371 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.909569 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:21:29 crc kubenswrapper[4842]: I0311 19:21:29.964428 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"71c193b9-90ca-4bba-9958-65c01e8a64e0","Type":"ContainerStarted","Data":"603b37999beb56878105a8ea81f9443adfe3d336d596494241eaf660c7276751"} Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.149706 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.328075 4842 scope.go:117] "RemoveContainer" containerID="2ccd974ce721f039dbf24694518bc042cdca375b64ef14687b821187c50c178c" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.353058 4842 scope.go:117] "RemoveContainer" containerID="f1a58b8d1fd964a5881734d0dc88d1f8301bcc4d29bf3251334dfee0e923a2ee" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.384054 4842 scope.go:117] "RemoveContainer" containerID="9b0c6736233c151848d04334b1e4be1d1e78c89c2e64a02c249766767e00286d" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.436615 4842 scope.go:117] "RemoveContainer" 
containerID="a98de39e24d997969f3f4c625aa016ed40a06b93652aa6a2a560eef9fb2ad795" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.484563 4842 scope.go:117] "RemoveContainer" containerID="586fd9696e7e5d4b8dba356243d03e667d6737a2bbb02c29994a7397c74731da" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.527341 4842 scope.go:117] "RemoveContainer" containerID="51711445d91b5b961f46016b717fdf0be2dde3b84c1b94b4bacf42d1eec37b57" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.550876 4842 scope.go:117] "RemoveContainer" containerID="0ffe9736c972d7f402106d5386a00c71fa67bef70d8cc297e74b639ceea37bda" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.575201 4842 scope.go:117] "RemoveContainer" containerID="52edfb9d8d37e047f5d0d37b453e2827b926eab4c1c4b4046c14ed42e68bdf82" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.607632 4842 scope.go:117] "RemoveContainer" containerID="5ed25c94a4ce838627dccb27f3a92ad987f8c96a38b142186bc118b557a100a0" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.648230 4842 scope.go:117] "RemoveContainer" containerID="ed8c4dc680cfc8295bd431eb3cacc2435203d6b35b9916942752ad00dbc9996d" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.669314 4842 scope.go:117] "RemoveContainer" containerID="9b41acb1732e8d27ffba758b7a44e575b9835935296b698ba4017583310a1799" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.706675 4842 scope.go:117] "RemoveContainer" containerID="ea4c54b26c9ad7259977dddef7f0094084dd6dc1317bfef6bbc8086968940002" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.731186 4842 scope.go:117] "RemoveContainer" containerID="764686a44630d563dbf64bed8e7376038bbc92363e11f7ed0ab264ddde643bfc" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.971482 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d89a6a3-6c92-44a6-8593-aef4bf6ab733" path="/var/lib/kubelet/pods/4d89a6a3-6c92-44a6-8593-aef4bf6ab733/volumes" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 
19:21:30.972361 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26955c4-4baf-4e5b-acb8-2716483cae3a" path="/var/lib/kubelet/pods/b26955c4-4baf-4e5b-acb8-2716483cae3a/volumes" Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.984466 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"d786c50e-6bfb-4b96-bec9-cfa618ab848a","Type":"ContainerStarted","Data":"4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e"} Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.984509 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"d786c50e-6bfb-4b96-bec9-cfa618ab848a","Type":"ContainerStarted","Data":"46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7"} Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.984523 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"d786c50e-6bfb-4b96-bec9-cfa618ab848a","Type":"ContainerStarted","Data":"877bf2a539615cbfbcacce68034981e4d4901ab49bef3b48073081dfe7067a4e"} Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.994491 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"71c193b9-90ca-4bba-9958-65c01e8a64e0","Type":"ContainerStarted","Data":"54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723"} Mar 11 19:21:30 crc kubenswrapper[4842]: I0311 19:21:30.994535 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"71c193b9-90ca-4bba-9958-65c01e8a64e0","Type":"ContainerStarted","Data":"6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121"} Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.016562 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-metadata-0" podStartSLOduration=2.016527945 
podStartE2EDuration="2.016527945s" podCreationTimestamp="2026-03-11 19:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:31.000359367 +0000 UTC m=+1936.648055657" watchObservedRunningTime="2026-03-11 19:21:31.016527945 +0000 UTC m=+1936.664224265" Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.028919 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=3.028901383 podStartE2EDuration="3.028901383s" podCreationTimestamp="2026-03-11 19:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:31.019041652 +0000 UTC m=+1936.666737922" watchObservedRunningTime="2026-03-11 19:21:31.028901383 +0000 UTC m=+1936.676597663" Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.455884 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.561233 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp5jp\" (UniqueName: \"kubernetes.io/projected/62e52316-b266-4eeb-8ffb-f410b7b278ef-kube-api-access-wp5jp\") pod \"62e52316-b266-4eeb-8ffb-f410b7b278ef\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.561452 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e52316-b266-4eeb-8ffb-f410b7b278ef-config-data\") pod \"62e52316-b266-4eeb-8ffb-f410b7b278ef\" (UID: \"62e52316-b266-4eeb-8ffb-f410b7b278ef\") " Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.567153 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e52316-b266-4eeb-8ffb-f410b7b278ef-kube-api-access-wp5jp" (OuterVolumeSpecName: "kube-api-access-wp5jp") pod "62e52316-b266-4eeb-8ffb-f410b7b278ef" (UID: "62e52316-b266-4eeb-8ffb-f410b7b278ef"). InnerVolumeSpecName "kube-api-access-wp5jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.589030 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e52316-b266-4eeb-8ffb-f410b7b278ef-config-data" (OuterVolumeSpecName: "config-data") pod "62e52316-b266-4eeb-8ffb-f410b7b278ef" (UID: "62e52316-b266-4eeb-8ffb-f410b7b278ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.663985 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp5jp\" (UniqueName: \"kubernetes.io/projected/62e52316-b266-4eeb-8ffb-f410b7b278ef-kube-api-access-wp5jp\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:31 crc kubenswrapper[4842]: I0311 19:21:31.664033 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e52316-b266-4eeb-8ffb-f410b7b278ef-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.006112 4842 generic.go:334] "Generic (PLEG): container finished" podID="62e52316-b266-4eeb-8ffb-f410b7b278ef" containerID="0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4" exitCode=0 Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.006194 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.006236 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"62e52316-b266-4eeb-8ffb-f410b7b278ef","Type":"ContainerDied","Data":"0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4"} Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.006317 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"62e52316-b266-4eeb-8ffb-f410b7b278ef","Type":"ContainerDied","Data":"8daa6a9475808af5c2237934c09c06386bd18f17a8fa8f731571fddeb63ef926"} Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.006346 4842 scope.go:117] "RemoveContainer" containerID="0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.040023 4842 scope.go:117] "RemoveContainer" 
containerID="0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4" Mar 11 19:21:32 crc kubenswrapper[4842]: E0311 19:21:32.040892 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4\": container with ID starting with 0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4 not found: ID does not exist" containerID="0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.040977 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4"} err="failed to get container status \"0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4\": rpc error: code = NotFound desc = could not find container \"0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4\": container with ID starting with 0dc703fe2a53558f96cd289dcf633e25b9a8ab1f47b7ce0b27e4166b40d8fda4 not found: ID does not exist" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.107687 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.120072 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.129730 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:32 crc kubenswrapper[4842]: E0311 19:21:32.130210 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e52316-b266-4eeb-8ffb-f410b7b278ef" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.130232 4842 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62e52316-b266-4eeb-8ffb-f410b7b278ef" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.130508 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e52316-b266-4eeb-8ffb-f410b7b278ef" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.131425 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.133585 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.139755 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.286316 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722e1368-2318-4f0e-a6d3-199f26cccc14-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.286387 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljq6r\" (UniqueName: \"kubernetes.io/projected/722e1368-2318-4f0e-a6d3-199f26cccc14-kube-api-access-ljq6r\") pod \"nova-kuttl-scheduler-0\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.388505 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722e1368-2318-4f0e-a6d3-199f26cccc14-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") " 
pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.388568 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljq6r\" (UniqueName: \"kubernetes.io/projected/722e1368-2318-4f0e-a6d3-199f26cccc14-kube-api-access-ljq6r\") pod \"nova-kuttl-scheduler-0\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.400125 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722e1368-2318-4f0e-a6d3-199f26cccc14-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.409132 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljq6r\" (UniqueName: \"kubernetes.io/projected/722e1368-2318-4f0e-a6d3-199f26cccc14-kube-api-access-ljq6r\") pod \"nova-kuttl-scheduler-0\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.460525 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.729129 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:21:32 crc kubenswrapper[4842]: W0311 19:21:32.734749 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod722e1368_2318_4f0e_a6d3_199f26cccc14.slice/crio-92898f7470c21cb4fb14a27b64968a7430217d257891d49328fedff164ee6bfd WatchSource:0}: Error finding container 92898f7470c21cb4fb14a27b64968a7430217d257891d49328fedff164ee6bfd: Status 404 returned error can't find the container with id 92898f7470c21cb4fb14a27b64968a7430217d257891d49328fedff164ee6bfd Mar 11 19:21:32 crc kubenswrapper[4842]: I0311 19:21:32.972871 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e52316-b266-4eeb-8ffb-f410b7b278ef" path="/var/lib/kubelet/pods/62e52316-b266-4eeb-8ffb-f410b7b278ef/volumes" Mar 11 19:21:33 crc kubenswrapper[4842]: I0311 19:21:33.031470 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"722e1368-2318-4f0e-a6d3-199f26cccc14","Type":"ContainerStarted","Data":"92898f7470c21cb4fb14a27b64968a7430217d257891d49328fedff164ee6bfd"} Mar 11 19:21:34 crc kubenswrapper[4842]: I0311 19:21:34.042011 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"722e1368-2318-4f0e-a6d3-199f26cccc14","Type":"ContainerStarted","Data":"489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039"} Mar 11 19:21:34 crc kubenswrapper[4842]: I0311 19:21:34.065793 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.065773339 podStartE2EDuration="2.065773339s" podCreationTimestamp="2026-03-11 19:21:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:21:34.062595835 +0000 UTC m=+1939.710292125" watchObservedRunningTime="2026-03-11 19:21:34.065773339 +0000 UTC m=+1939.713469619" Mar 11 19:21:37 crc kubenswrapper[4842]: I0311 19:21:37.461027 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:39 crc kubenswrapper[4842]: I0311 19:21:39.408989 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:39 crc kubenswrapper[4842]: I0311 19:21:39.409081 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:39 crc kubenswrapper[4842]: I0311 19:21:39.731942 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:39 crc kubenswrapper[4842]: I0311 19:21:39.732037 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:40 crc kubenswrapper[4842]: I0311 19:21:40.449474 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.1.20:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:21:40 crc kubenswrapper[4842]: I0311 19:21:40.449474 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.1.20:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:21:40 crc kubenswrapper[4842]: I0311 
19:21:40.814634 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.21:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:21:40 crc kubenswrapper[4842]: I0311 19:21:40.814638 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-log" probeResult="failure" output="Get \"http://10.217.1.21:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:21:42 crc kubenswrapper[4842]: I0311 19:21:42.461752 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:42 crc kubenswrapper[4842]: I0311 19:21:42.505724 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:43 crc kubenswrapper[4842]: I0311 19:21:43.186501 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:21:47 crc kubenswrapper[4842]: I0311 19:21:47.409532 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:47 crc kubenswrapper[4842]: I0311 19:21:47.410155 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:47 crc kubenswrapper[4842]: I0311 19:21:47.731503 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:47 crc kubenswrapper[4842]: I0311 19:21:47.731586 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:49 crc kubenswrapper[4842]: I0311 19:21:49.412582 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:49 crc kubenswrapper[4842]: I0311 19:21:49.412781 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:49 crc kubenswrapper[4842]: I0311 19:21:49.415366 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:49 crc kubenswrapper[4842]: I0311 19:21:49.416928 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:21:49 crc kubenswrapper[4842]: I0311 19:21:49.734058 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:49 crc kubenswrapper[4842]: I0311 19:21:49.736336 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:49 crc kubenswrapper[4842]: I0311 19:21:49.738448 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:21:50 crc kubenswrapper[4842]: I0311 19:21:50.210071 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.133149 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554282-lq6cv"] Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.134784 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554282-lq6cv" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.136845 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.138092 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.140216 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.144371 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554282-lq6cv"] Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.162784 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/828994bf-fa6e-4227-8b01-dc6d1b3b9cec-kube-api-access-qsht8\") pod \"auto-csr-approver-29554282-lq6cv\" (UID: \"828994bf-fa6e-4227-8b01-dc6d1b3b9cec\") " pod="openshift-infra/auto-csr-approver-29554282-lq6cv" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.264584 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/828994bf-fa6e-4227-8b01-dc6d1b3b9cec-kube-api-access-qsht8\") pod \"auto-csr-approver-29554282-lq6cv\" (UID: \"828994bf-fa6e-4227-8b01-dc6d1b3b9cec\") " pod="openshift-infra/auto-csr-approver-29554282-lq6cv" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.282043 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/828994bf-fa6e-4227-8b01-dc6d1b3b9cec-kube-api-access-qsht8\") pod \"auto-csr-approver-29554282-lq6cv\" (UID: \"828994bf-fa6e-4227-8b01-dc6d1b3b9cec\") " 
pod="openshift-infra/auto-csr-approver-29554282-lq6cv" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.478956 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554282-lq6cv" Mar 11 19:22:00 crc kubenswrapper[4842]: I0311 19:22:00.933783 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554282-lq6cv"] Mar 11 19:22:01 crc kubenswrapper[4842]: I0311 19:22:01.366228 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554282-lq6cv" event={"ID":"828994bf-fa6e-4227-8b01-dc6d1b3b9cec","Type":"ContainerStarted","Data":"840aec85c17a74302e63b7392ed2a03797ec200fcfcc631b6bc08ae5e92dd2a2"} Mar 11 19:22:03 crc kubenswrapper[4842]: I0311 19:22:03.388243 4842 generic.go:334] "Generic (PLEG): container finished" podID="828994bf-fa6e-4227-8b01-dc6d1b3b9cec" containerID="d13644760e036f5c2aeb72c6729f9ae6de24b5bc469328939bee3e7c981dba6e" exitCode=0 Mar 11 19:22:03 crc kubenswrapper[4842]: I0311 19:22:03.388347 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554282-lq6cv" event={"ID":"828994bf-fa6e-4227-8b01-dc6d1b3b9cec","Type":"ContainerDied","Data":"d13644760e036f5c2aeb72c6729f9ae6de24b5bc469328939bee3e7c981dba6e"} Mar 11 19:22:04 crc kubenswrapper[4842]: I0311 19:22:04.766539 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554282-lq6cv" Mar 11 19:22:04 crc kubenswrapper[4842]: I0311 19:22:04.943932 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/828994bf-fa6e-4227-8b01-dc6d1b3b9cec-kube-api-access-qsht8\") pod \"828994bf-fa6e-4227-8b01-dc6d1b3b9cec\" (UID: \"828994bf-fa6e-4227-8b01-dc6d1b3b9cec\") " Mar 11 19:22:04 crc kubenswrapper[4842]: I0311 19:22:04.948983 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828994bf-fa6e-4227-8b01-dc6d1b3b9cec-kube-api-access-qsht8" (OuterVolumeSpecName: "kube-api-access-qsht8") pod "828994bf-fa6e-4227-8b01-dc6d1b3b9cec" (UID: "828994bf-fa6e-4227-8b01-dc6d1b3b9cec"). InnerVolumeSpecName "kube-api-access-qsht8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:05 crc kubenswrapper[4842]: I0311 19:22:05.045585 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/828994bf-fa6e-4227-8b01-dc6d1b3b9cec-kube-api-access-qsht8\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:05 crc kubenswrapper[4842]: I0311 19:22:05.412849 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554282-lq6cv" event={"ID":"828994bf-fa6e-4227-8b01-dc6d1b3b9cec","Type":"ContainerDied","Data":"840aec85c17a74302e63b7392ed2a03797ec200fcfcc631b6bc08ae5e92dd2a2"} Mar 11 19:22:05 crc kubenswrapper[4842]: I0311 19:22:05.413177 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="840aec85c17a74302e63b7392ed2a03797ec200fcfcc631b6bc08ae5e92dd2a2" Mar 11 19:22:05 crc kubenswrapper[4842]: I0311 19:22:05.412924 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554282-lq6cv" Mar 11 19:22:05 crc kubenswrapper[4842]: I0311 19:22:05.832951 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554276-9xs5k"] Mar 11 19:22:05 crc kubenswrapper[4842]: I0311 19:22:05.840256 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554276-9xs5k"] Mar 11 19:22:06 crc kubenswrapper[4842]: I0311 19:22:06.971826 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99cb0c62-5098-459d-b8e2-290d05c30b60" path="/var/lib/kubelet/pods/99cb0c62-5098-459d-b8e2-290d05c30b60/volumes" Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.175960 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.176222 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="c808fee0-be92-4eae-9774-9d89393aacb9" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62" gracePeriod=30 Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.187492 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"] Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.187743 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" containerName="nova-kuttl-cell1-compute-fake1-compute-compute" containerID="cri-o://ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" gracePeriod=30 Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.195848 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.196086 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="722e1368-2318-4f0e-a6d3-199f26cccc14" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039" gracePeriod=30 Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.346396 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.346650 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-log" containerID="cri-o://6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121" gracePeriod=30 Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.346770 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-api" containerID="cri-o://54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723" gracePeriod=30 Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.383821 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:22:08 crc kubenswrapper[4842]: I0311 19:22:08.384093 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="540ddacd-a44d-4c6b-b382-d43c70ca2470" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c" gracePeriod=30 Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.340732 4842 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.456098 4842 generic.go:334] "Generic (PLEG): container finished" podID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerID="6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121" exitCode=143 Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.456182 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"71c193b9-90ca-4bba-9958-65c01e8a64e0","Type":"ContainerDied","Data":"6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121"} Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.457775 4842 generic.go:334] "Generic (PLEG): container finished" podID="540ddacd-a44d-4c6b-b382-d43c70ca2470" containerID="9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c" exitCode=0 Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.457814 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.457821 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"540ddacd-a44d-4c6b-b382-d43c70ca2470","Type":"ContainerDied","Data":"9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c"}
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.457874 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"540ddacd-a44d-4c6b-b382-d43c70ca2470","Type":"ContainerDied","Data":"a2042246210792a9cb3424ca5e105d34d99238abf3d7cb31f3dde171c7199e91"}
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.457893 4842 scope.go:117] "RemoveContainer" containerID="9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.484843 4842 scope.go:117] "RemoveContainer" containerID="9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c"
Mar 11 19:22:09 crc kubenswrapper[4842]: E0311 19:22:09.485326 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c\": container with ID starting with 9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c not found: ID does not exist" containerID="9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.485377 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c"} err="failed to get container status \"9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c\": rpc error: code = NotFound desc = could not find container \"9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c\": container with ID starting with 9ba35165c4425e58f76d05b2c076f5b7752ac6577a66297a6adeb59e5de1e99c not found: ID does not exist"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.518193 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ddacd-a44d-4c6b-b382-d43c70ca2470-config-data\") pod \"540ddacd-a44d-4c6b-b382-d43c70ca2470\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") "
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.518295 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm6k9\" (UniqueName: \"kubernetes.io/projected/540ddacd-a44d-4c6b-b382-d43c70ca2470-kube-api-access-gm6k9\") pod \"540ddacd-a44d-4c6b-b382-d43c70ca2470\" (UID: \"540ddacd-a44d-4c6b-b382-d43c70ca2470\") "
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.523630 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540ddacd-a44d-4c6b-b382-d43c70ca2470-kube-api-access-gm6k9" (OuterVolumeSpecName: "kube-api-access-gm6k9") pod "540ddacd-a44d-4c6b-b382-d43c70ca2470" (UID: "540ddacd-a44d-4c6b-b382-d43c70ca2470"). InnerVolumeSpecName "kube-api-access-gm6k9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.539189 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540ddacd-a44d-4c6b-b382-d43c70ca2470-config-data" (OuterVolumeSpecName: "config-data") pod "540ddacd-a44d-4c6b-b382-d43c70ca2470" (UID: "540ddacd-a44d-4c6b-b382-d43c70ca2470"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.619951 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/540ddacd-a44d-4c6b-b382-d43c70ca2470-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.619993 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm6k9\" (UniqueName: \"kubernetes.io/projected/540ddacd-a44d-4c6b-b382-d43c70ca2470-kube-api-access-gm6k9\") on node \"crc\" DevicePath \"\""
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.788481 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.795955 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.815659 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:22:09 crc kubenswrapper[4842]: E0311 19:22:09.817553 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828994bf-fa6e-4227-8b01-dc6d1b3b9cec" containerName="oc"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.817612 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="828994bf-fa6e-4227-8b01-dc6d1b3b9cec" containerName="oc"
Mar 11 19:22:09 crc kubenswrapper[4842]: E0311 19:22:09.817660 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540ddacd-a44d-4c6b-b382-d43c70ca2470" containerName="nova-kuttl-cell1-conductor-conductor"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.817668 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="540ddacd-a44d-4c6b-b382-d43c70ca2470" containerName="nova-kuttl-cell1-conductor-conductor"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.818721 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="828994bf-fa6e-4227-8b01-dc6d1b3b9cec" containerName="oc"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.818789 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="540ddacd-a44d-4c6b-b382-d43c70ca2470" containerName="nova-kuttl-cell1-conductor-conductor"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.821942 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.831068 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-conductor-config-data"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.851331 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.927567 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spkrc\" (UniqueName: \"kubernetes.io/projected/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-kube-api-access-spkrc\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:09 crc kubenswrapper[4842]: I0311 19:22:09.927757 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:10 crc kubenswrapper[4842]: I0311 19:22:10.029728 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spkrc\" (UniqueName: \"kubernetes.io/projected/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-kube-api-access-spkrc\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:10 crc kubenswrapper[4842]: I0311 19:22:10.029886 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:10 crc kubenswrapper[4842]: I0311 19:22:10.034358 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-config-data\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:10 crc kubenswrapper[4842]: I0311 19:22:10.049333 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spkrc\" (UniqueName: \"kubernetes.io/projected/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-kube-api-access-spkrc\") pod \"nova-kuttl-cell1-conductor-0\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:10 crc kubenswrapper[4842]: I0311 19:22:10.160020 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:10 crc kubenswrapper[4842]: I0311 19:22:10.598443 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"]
Mar 11 19:22:10 crc kubenswrapper[4842]: I0311 19:22:10.971594 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="540ddacd-a44d-4c6b-b382-d43c70ca2470" path="/var/lib/kubelet/pods/540ddacd-a44d-4c6b-b382-d43c70ca2470/volumes"
Mar 11 19:22:11 crc kubenswrapper[4842]: E0311 19:22:11.439311 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:22:11 crc kubenswrapper[4842]: E0311 19:22:11.440626 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:22:11 crc kubenswrapper[4842]: E0311 19:22:11.441891 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:22:11 crc kubenswrapper[4842]: E0311 19:22:11.441987 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:22:11 crc kubenswrapper[4842]: I0311 19:22:11.489961 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1","Type":"ContainerStarted","Data":"a4a9008cebab810984a179f74dd3c1c4bc28e5dd045f9ce13ce6885b40bfa6ce"}
Mar 11 19:22:11 crc kubenswrapper[4842]: I0311 19:22:11.490018 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1","Type":"ContainerStarted","Data":"348968470c193dd3e43e67205c5d97f8bb4f25b1fdf70f23464358fc8e783b0c"}
Mar 11 19:22:11 crc kubenswrapper[4842]: I0311 19:22:11.491138 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0"
Mar 11 19:22:11 crc kubenswrapper[4842]: I0311 19:22:11.510862 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podStartSLOduration=2.510841904 podStartE2EDuration="2.510841904s" podCreationTimestamp="2026-03-11 19:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:22:11.507409773 +0000 UTC m=+1977.155106053" watchObservedRunningTime="2026-03-11 19:22:11.510841904 +0000 UTC m=+1977.158538184"
Mar 11 19:22:11 crc kubenswrapper[4842]: I0311 19:22:11.949436 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.063520 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzs49\" (UniqueName: \"kubernetes.io/projected/71c193b9-90ca-4bba-9958-65c01e8a64e0-kube-api-access-vzs49\") pod \"71c193b9-90ca-4bba-9958-65c01e8a64e0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") "
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.063649 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c193b9-90ca-4bba-9958-65c01e8a64e0-config-data\") pod \"71c193b9-90ca-4bba-9958-65c01e8a64e0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") "
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.063858 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c193b9-90ca-4bba-9958-65c01e8a64e0-logs\") pod \"71c193b9-90ca-4bba-9958-65c01e8a64e0\" (UID: \"71c193b9-90ca-4bba-9958-65c01e8a64e0\") "
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.065035 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c193b9-90ca-4bba-9958-65c01e8a64e0-logs" (OuterVolumeSpecName: "logs") pod "71c193b9-90ca-4bba-9958-65c01e8a64e0" (UID: "71c193b9-90ca-4bba-9958-65c01e8a64e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.073699 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c193b9-90ca-4bba-9958-65c01e8a64e0-kube-api-access-vzs49" (OuterVolumeSpecName: "kube-api-access-vzs49") pod "71c193b9-90ca-4bba-9958-65c01e8a64e0" (UID: "71c193b9-90ca-4bba-9958-65c01e8a64e0"). InnerVolumeSpecName "kube-api-access-vzs49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.125283 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c193b9-90ca-4bba-9958-65c01e8a64e0-config-data" (OuterVolumeSpecName: "config-data") pod "71c193b9-90ca-4bba-9958-65c01e8a64e0" (UID: "71c193b9-90ca-4bba-9958-65c01e8a64e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.166089 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzs49\" (UniqueName: \"kubernetes.io/projected/71c193b9-90ca-4bba-9958-65c01e8a64e0-kube-api-access-vzs49\") on node \"crc\" DevicePath \"\""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.166122 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c193b9-90ca-4bba-9958-65c01e8a64e0-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.166132 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c193b9-90ca-4bba-9958-65c01e8a64e0-logs\") on node \"crc\" DevicePath \"\""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.226739 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.368374 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722e1368-2318-4f0e-a6d3-199f26cccc14-config-data\") pod \"722e1368-2318-4f0e-a6d3-199f26cccc14\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") "
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.368569 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljq6r\" (UniqueName: \"kubernetes.io/projected/722e1368-2318-4f0e-a6d3-199f26cccc14-kube-api-access-ljq6r\") pod \"722e1368-2318-4f0e-a6d3-199f26cccc14\" (UID: \"722e1368-2318-4f0e-a6d3-199f26cccc14\") "
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.372754 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722e1368-2318-4f0e-a6d3-199f26cccc14-kube-api-access-ljq6r" (OuterVolumeSpecName: "kube-api-access-ljq6r") pod "722e1368-2318-4f0e-a6d3-199f26cccc14" (UID: "722e1368-2318-4f0e-a6d3-199f26cccc14"). InnerVolumeSpecName "kube-api-access-ljq6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.390685 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722e1368-2318-4f0e-a6d3-199f26cccc14-config-data" (OuterVolumeSpecName: "config-data") pod "722e1368-2318-4f0e-a6d3-199f26cccc14" (UID: "722e1368-2318-4f0e-a6d3-199f26cccc14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.472672 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722e1368-2318-4f0e-a6d3-199f26cccc14-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.472706 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljq6r\" (UniqueName: \"kubernetes.io/projected/722e1368-2318-4f0e-a6d3-199f26cccc14-kube-api-access-ljq6r\") on node \"crc\" DevicePath \"\""
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.503088 4842 generic.go:334] "Generic (PLEG): container finished" podID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerID="54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723" exitCode=0
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.503287 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"71c193b9-90ca-4bba-9958-65c01e8a64e0","Type":"ContainerDied","Data":"54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723"}
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.503346 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"71c193b9-90ca-4bba-9958-65c01e8a64e0","Type":"ContainerDied","Data":"603b37999beb56878105a8ea81f9443adfe3d336d596494241eaf660c7276751"}
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.503353 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.503368 4842 scope.go:117] "RemoveContainer" containerID="54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.505444 4842 generic.go:334] "Generic (PLEG): container finished" podID="722e1368-2318-4f0e-a6d3-199f26cccc14" containerID="489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039" exitCode=0
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.507845 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.512027 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"722e1368-2318-4f0e-a6d3-199f26cccc14","Type":"ContainerDied","Data":"489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039"}
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.512070 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"722e1368-2318-4f0e-a6d3-199f26cccc14","Type":"ContainerDied","Data":"92898f7470c21cb4fb14a27b64968a7430217d257891d49328fedff164ee6bfd"}
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.521584 4842 scope.go:117] "RemoveContainer" containerID="6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.544931 4842 scope.go:117] "RemoveContainer" containerID="54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723"
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.545910 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723\": container with ID starting with 54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723 not found: ID does not exist" containerID="54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.545975 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723"} err="failed to get container status \"54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723\": rpc error: code = NotFound desc = could not find container \"54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723\": container with ID starting with 54ebaa1e446ca05c834ff3ebb3063f3be4de853659c501ad0ed16e858b157723 not found: ID does not exist"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.546004 4842 scope.go:117] "RemoveContainer" containerID="6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121"
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.546523 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121\": container with ID starting with 6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121 not found: ID does not exist" containerID="6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.546576 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121"} err="failed to get container status \"6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121\": rpc error: code = NotFound desc = could not find container \"6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121\": container with ID starting with 6b6205ed1a9684e239fac55dcc6ca0c4a5c070f602d9b864a8451655667fd121 not found: ID does not exist"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.546614 4842 scope.go:117] "RemoveContainer" containerID="489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.556643 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.581411 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.595678 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.596119 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-api"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.596140 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-api"
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.596161 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722e1368-2318-4f0e-a6d3-199f26cccc14" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.596170 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="722e1368-2318-4f0e-a6d3-199f26cccc14" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.596183 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-log"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.596191 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-log"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.596365 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-log"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.596383 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="722e1368-2318-4f0e-a6d3-199f26cccc14" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.596396 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" containerName="nova-kuttl-api-api"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.597345 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.599867 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-api-config-data"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.601126 4842 scope.go:117] "RemoveContainer" containerID="489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039"
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.602083 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039\": container with ID starting with 489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039 not found: ID does not exist" containerID="489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.602126 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039"} err="failed to get container status \"489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039\": rpc error: code = NotFound desc = could not find container \"489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039\": container with ID starting with 489c986dd89e8153f5343cdb5cac032ad98eaf9ef2f409b6e10e404f86167039 not found: ID does not exist"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.611118 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.624001 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.630388 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.639734 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.641024 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.642900 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-scheduler-config-data"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.649528 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.675290 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sfx9\" (UniqueName: \"kubernetes.io/projected/75caedc3-0ec9-4f3f-a381-8459fe9cad15-kube-api-access-8sfx9\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.675350 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75caedc3-0ec9-4f3f-a381-8459fe9cad15-logs\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.675420 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75caedc3-0ec9-4f3f-a381-8459fe9cad15-config-data\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.713504 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.714877 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.715847 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 11 19:22:12 crc kubenswrapper[4842]: E0311 19:22:12.715884 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="c808fee0-be92-4eae-9774-9d89393aacb9" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.776673 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dskz\" (UniqueName: \"kubernetes.io/projected/3fd5d983-467b-4c01-aec7-079a61880193-kube-api-access-5dskz\") pod \"nova-kuttl-scheduler-0\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.776762 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75caedc3-0ec9-4f3f-a381-8459fe9cad15-logs\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.776825 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75caedc3-0ec9-4f3f-a381-8459fe9cad15-config-data\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.776860 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd5d983-467b-4c01-aec7-079a61880193-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.776925 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sfx9\" (UniqueName: \"kubernetes.io/projected/75caedc3-0ec9-4f3f-a381-8459fe9cad15-kube-api-access-8sfx9\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.778125 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75caedc3-0ec9-4f3f-a381-8459fe9cad15-logs\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.784155 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75caedc3-0ec9-4f3f-a381-8459fe9cad15-config-data\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.792924 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sfx9\" (UniqueName: \"kubernetes.io/projected/75caedc3-0ec9-4f3f-a381-8459fe9cad15-kube-api-access-8sfx9\") pod \"nova-kuttl-api-0\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.879059 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd5d983-467b-4c01-aec7-079a61880193-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.879185 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dskz\" (UniqueName: \"kubernetes.io/projected/3fd5d983-467b-4c01-aec7-079a61880193-kube-api-access-5dskz\") pod \"nova-kuttl-scheduler-0\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.885047 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd5d983-467b-4c01-aec7-079a61880193-config-data\") pod \"nova-kuttl-scheduler-0\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.897756 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dskz\" (UniqueName: \"kubernetes.io/projected/3fd5d983-467b-4c01-aec7-079a61880193-kube-api-access-5dskz\") pod \"nova-kuttl-scheduler-0\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.965661 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.973353 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c193b9-90ca-4bba-9958-65c01e8a64e0" path="/var/lib/kubelet/pods/71c193b9-90ca-4bba-9958-65c01e8a64e0/volumes"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.973402 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0"
Mar 11 19:22:12 crc kubenswrapper[4842]: I0311 19:22:12.974052 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722e1368-2318-4f0e-a6d3-199f26cccc14" path="/var/lib/kubelet/pods/722e1368-2318-4f0e-a6d3-199f26cccc14/volumes"
Mar 11 19:22:13 crc kubenswrapper[4842]: I0311 19:22:13.426499 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"]
Mar 11 19:22:13 crc kubenswrapper[4842]: W0311 19:22:13.427260 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fd5d983_467b_4c01_aec7_079a61880193.slice/crio-5bc9f33fc9d00826b9790a5a91fbd5a288ae9d5a9661503c1b1cd04a120e6025 WatchSource:0}: Error finding container 5bc9f33fc9d00826b9790a5a91fbd5a288ae9d5a9661503c1b1cd04a120e6025: Status 404 returned error can't find the container with id 5bc9f33fc9d00826b9790a5a91fbd5a288ae9d5a9661503c1b1cd04a120e6025
Mar 11 19:22:13 crc kubenswrapper[4842]: I0311 19:22:13.516957 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"3fd5d983-467b-4c01-aec7-079a61880193","Type":"ContainerStarted","Data":"5bc9f33fc9d00826b9790a5a91fbd5a288ae9d5a9661503c1b1cd04a120e6025"}
Mar 11 19:22:13 crc kubenswrapper[4842]: W0311 19:22:13.546469 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75caedc3_0ec9_4f3f_a381_8459fe9cad15.slice/crio-96aeec149283d8a185f8370a9b84ae82c22eca755733ce82ef95851639c4582c WatchSource:0}: Error finding container 96aeec149283d8a185f8370a9b84ae82c22eca755733ce82ef95851639c4582c: Status 404 returned error can't find the container with id 96aeec149283d8a185f8370a9b84ae82c22eca755733ce82ef95851639c4582c
Mar 11 19:22:13 crc kubenswrapper[4842]: I0311 19:22:13.547629 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"]
Mar 11 19:22:14 crc kubenswrapper[4842]: I0311 19:22:14.525952 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"75caedc3-0ec9-4f3f-a381-8459fe9cad15","Type":"ContainerStarted","Data":"870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b"}
Mar 11 19:22:14 crc kubenswrapper[4842]: I0311 19:22:14.525996 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"75caedc3-0ec9-4f3f-a381-8459fe9cad15","Type":"ContainerStarted","Data":"478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80"}
Mar 11 19:22:14 crc kubenswrapper[4842]: I0311 19:22:14.526007 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"75caedc3-0ec9-4f3f-a381-8459fe9cad15","Type":"ContainerStarted","Data":"96aeec149283d8a185f8370a9b84ae82c22eca755733ce82ef95851639c4582c"}
Mar 11 19:22:14 crc kubenswrapper[4842]: I0311 19:22:14.527826 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"3fd5d983-467b-4c01-aec7-079a61880193","Type":"ContainerStarted","Data":"cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b"}
Mar 11 19:22:14 crc kubenswrapper[4842]: I0311 19:22:14.543975 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-api-0" podStartSLOduration=2.543947159 podStartE2EDuration="2.543947159s" podCreationTimestamp="2026-03-11 19:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:22:14.539967014 +0000 UTC m=+1980.187663314" watchObservedRunningTime="2026-03-11 19:22:14.543947159 +0000 UTC m=+1980.191643439"
Mar 11 19:22:14 crc kubenswrapper[4842]: I0311 19:22:14.561380 4842 pod_startup_latency_tracker.go:104] "Observed pod
startup duration" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podStartSLOduration=2.56136163 podStartE2EDuration="2.56136163s" podCreationTimestamp="2026-03-11 19:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:22:14.555950257 +0000 UTC m=+1980.203646537" watchObservedRunningTime="2026-03-11 19:22:14.56136163 +0000 UTC m=+1980.209057910" Mar 11 19:22:15 crc kubenswrapper[4842]: I0311 19:22:15.196149 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:22:16 crc kubenswrapper[4842]: E0311 19:22:16.437897 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:16 crc kubenswrapper[4842]: E0311 19:22:16.439441 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:16 crc kubenswrapper[4842]: E0311 19:22:16.440889 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:16 crc kubenswrapper[4842]: E0311 19:22:16.440934 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" containerName="nova-kuttl-cell1-compute-fake1-compute-compute" Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.549012 4842 generic.go:334] "Generic (PLEG): container finished" podID="c808fee0-be92-4eae-9774-9d89393aacb9" containerID="36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62" exitCode=0 Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.549115 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"c808fee0-be92-4eae-9774-9d89393aacb9","Type":"ContainerDied","Data":"36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62"} Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.549177 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"c808fee0-be92-4eae-9774-9d89393aacb9","Type":"ContainerDied","Data":"6dcd4ad61a322b007b7b9ffd501f4683e371c73ff7bb155f540ab43bc784ba2b"} Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.549192 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dcd4ad61a322b007b7b9ffd501f4683e371c73ff7bb155f540ab43bc784ba2b" Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.574171 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.646940 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c808fee0-be92-4eae-9774-9d89393aacb9-config-data\") pod \"c808fee0-be92-4eae-9774-9d89393aacb9\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.647057 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxw4c\" (UniqueName: \"kubernetes.io/projected/c808fee0-be92-4eae-9774-9d89393aacb9-kube-api-access-sxw4c\") pod \"c808fee0-be92-4eae-9774-9d89393aacb9\" (UID: \"c808fee0-be92-4eae-9774-9d89393aacb9\") " Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.654015 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c808fee0-be92-4eae-9774-9d89393aacb9-kube-api-access-sxw4c" (OuterVolumeSpecName: "kube-api-access-sxw4c") pod "c808fee0-be92-4eae-9774-9d89393aacb9" (UID: "c808fee0-be92-4eae-9774-9d89393aacb9"). InnerVolumeSpecName "kube-api-access-sxw4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.667947 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c808fee0-be92-4eae-9774-9d89393aacb9-config-data" (OuterVolumeSpecName: "config-data") pod "c808fee0-be92-4eae-9774-9d89393aacb9" (UID: "c808fee0-be92-4eae-9774-9d89393aacb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.749009 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c808fee0-be92-4eae-9774-9d89393aacb9-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:16 crc kubenswrapper[4842]: I0311 19:22:16.749049 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxw4c\" (UniqueName: \"kubernetes.io/projected/c808fee0-be92-4eae-9774-9d89393aacb9-kube-api-access-sxw4c\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.557499 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.577622 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.585843 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.603979 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:17 crc kubenswrapper[4842]: E0311 19:22:17.605967 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c808fee0-be92-4eae-9774-9d89393aacb9" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.606059 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="c808fee0-be92-4eae-9774-9d89393aacb9" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.606322 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="c808fee0-be92-4eae-9774-9d89393aacb9" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:22:17 crc 
kubenswrapper[4842]: I0311 19:22:17.606990 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.609430 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell0-conductor-config-data" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.624188 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.664321 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ft5\" (UniqueName: \"kubernetes.io/projected/f574839a-5168-492d-a22b-ecfbed63b274-kube-api-access-r7ft5\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.664394 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f574839a-5168-492d-a22b-ecfbed63b274-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.766377 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ft5\" (UniqueName: \"kubernetes.io/projected/f574839a-5168-492d-a22b-ecfbed63b274-kube-api-access-r7ft5\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.766442 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f574839a-5168-492d-a22b-ecfbed63b274-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.773002 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f574839a-5168-492d-a22b-ecfbed63b274-config-data\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.782183 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ft5\" (UniqueName: \"kubernetes.io/projected/f574839a-5168-492d-a22b-ecfbed63b274-kube-api-access-r7ft5\") pod \"nova-kuttl-cell0-conductor-0\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.925226 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:17 crc kubenswrapper[4842]: I0311 19:22:17.974746 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:22:18 crc kubenswrapper[4842]: I0311 19:22:18.258400 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:18 crc kubenswrapper[4842]: W0311 19:22:18.286494 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf574839a_5168_492d_a22b_ecfbed63b274.slice/crio-3228f88e8d951f473d152f9f48caa493a1640c7e26698c5e1d6dcba50ce26578 WatchSource:0}: Error finding container 3228f88e8d951f473d152f9f48caa493a1640c7e26698c5e1d6dcba50ce26578: Status 404 returned error can't find the container with id 3228f88e8d951f473d152f9f48caa493a1640c7e26698c5e1d6dcba50ce26578 Mar 11 19:22:18 crc kubenswrapper[4842]: I0311 19:22:18.568685 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"f574839a-5168-492d-a22b-ecfbed63b274","Type":"ContainerStarted","Data":"2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8"} Mar 11 19:22:18 crc kubenswrapper[4842]: I0311 19:22:18.568732 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"f574839a-5168-492d-a22b-ecfbed63b274","Type":"ContainerStarted","Data":"3228f88e8d951f473d152f9f48caa493a1640c7e26698c5e1d6dcba50ce26578"} Mar 11 19:22:18 crc kubenswrapper[4842]: I0311 19:22:18.568852 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:18 crc kubenswrapper[4842]: I0311 19:22:18.586500 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" 
podStartSLOduration=1.5864804449999999 podStartE2EDuration="1.586480445s" podCreationTimestamp="2026-03-11 19:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:22:18.583910227 +0000 UTC m=+1984.231606507" watchObservedRunningTime="2026-03-11 19:22:18.586480445 +0000 UTC m=+1984.234176725" Mar 11 19:22:18 crc kubenswrapper[4842]: I0311 19:22:18.971767 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c808fee0-be92-4eae-9774-9d89393aacb9" path="/var/lib/kubelet/pods/c808fee0-be92-4eae-9774-9d89393aacb9/volumes" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.249206 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.294246 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-config-data\") pod \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.294443 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km5s7\" (UniqueName: \"kubernetes.io/projected/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-kube-api-access-km5s7\") pod \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\" (UID: \"457eeec2-b96e-4bb3-9087-3c73cb0c96c9\") " Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.309890 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-kube-api-access-km5s7" (OuterVolumeSpecName: "kube-api-access-km5s7") pod "457eeec2-b96e-4bb3-9087-3c73cb0c96c9" (UID: "457eeec2-b96e-4bb3-9087-3c73cb0c96c9"). InnerVolumeSpecName "kube-api-access-km5s7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.321768 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-config-data" (OuterVolumeSpecName: "config-data") pod "457eeec2-b96e-4bb3-9087-3c73cb0c96c9" (UID: "457eeec2-b96e-4bb3-9087-3c73cb0c96c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.397266 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.397317 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km5s7\" (UniqueName: \"kubernetes.io/projected/457eeec2-b96e-4bb3-9087-3c73cb0c96c9-kube-api-access-km5s7\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.578510 4842 generic.go:334] "Generic (PLEG): container finished" podID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" exitCode=0 Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.578668 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.578869 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"457eeec2-b96e-4bb3-9087-3c73cb0c96c9","Type":"ContainerDied","Data":"ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54"} Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.578915 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"457eeec2-b96e-4bb3-9087-3c73cb0c96c9","Type":"ContainerDied","Data":"56816025f7b16028f0e1c483ab467d61d48cbaa0e7c53a280cbf168d7619a0f3"} Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.578933 4842 scope.go:117] "RemoveContainer" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.608658 4842 scope.go:117] "RemoveContainer" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" Mar 11 19:22:19 crc kubenswrapper[4842]: E0311 19:22:19.609066 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54\": container with ID starting with ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54 not found: ID does not exist" containerID="ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.609111 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54"} err="failed to get container status \"ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54\": rpc error: code = NotFound desc = could not find container 
\"ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54\": container with ID starting with ac49105b774ddd73e7fdaaf21708c642c5bc14fa75fba486eb897f175ef53a54 not found: ID does not exist" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.613824 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"] Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.619159 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"] Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.637445 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"] Mar 11 19:22:19 crc kubenswrapper[4842]: E0311 19:22:19.637964 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" containerName="nova-kuttl-cell1-compute-fake1-compute-compute" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.637989 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" containerName="nova-kuttl-cell1-compute-fake1-compute-compute" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.638197 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" containerName="nova-kuttl-cell1-compute-fake1-compute-compute" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.638967 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.641217 4842 reflector.go:368] Caches populated for *v1.Secret from object-"nova-kuttl-default"/"nova-kuttl-cell1-compute-fake1-compute-config-data" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.644694 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"] Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.705688 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.705965 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9q64\" (UniqueName: \"kubernetes.io/projected/257b074b-1b06-46f4-8120-1e377849955c-kube-api-access-w9q64\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.807834 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9q64\" (UniqueName: \"kubernetes.io/projected/257b074b-1b06-46f4-8120-1e377849955c-kube-api-access-w9q64\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.807975 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.812501 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.824974 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9q64\" (UniqueName: \"kubernetes.io/projected/257b074b-1b06-46f4-8120-1e377849955c-kube-api-access-w9q64\") pod \"nova-kuttl-cell1-compute-fake1-compute-0\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") " pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:19 crc kubenswrapper[4842]: I0311 19:22:19.997336 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:20 crc kubenswrapper[4842]: I0311 19:22:20.419761 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"] Mar 11 19:22:20 crc kubenswrapper[4842]: W0311 19:22:20.429236 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod257b074b_1b06_46f4_8120_1e377849955c.slice/crio-2475dffb3384e6aae3ed504b4d1993ae6d0ddf3500002e7725ffab137c554a04 WatchSource:0}: Error finding container 2475dffb3384e6aae3ed504b4d1993ae6d0ddf3500002e7725ffab137c554a04: Status 404 returned error can't find the container with id 2475dffb3384e6aae3ed504b4d1993ae6d0ddf3500002e7725ffab137c554a04 Mar 11 19:22:20 crc kubenswrapper[4842]: I0311 19:22:20.588373 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerStarted","Data":"2475dffb3384e6aae3ed504b4d1993ae6d0ddf3500002e7725ffab137c554a04"} Mar 11 19:22:20 crc kubenswrapper[4842]: I0311 19:22:20.972606 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="457eeec2-b96e-4bb3-9087-3c73cb0c96c9" path="/var/lib/kubelet/pods/457eeec2-b96e-4bb3-9087-3c73cb0c96c9/volumes" Mar 11 19:22:21 crc kubenswrapper[4842]: I0311 19:22:21.599125 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerStarted","Data":"54c961de2b9e7df561e6418cc1c5ae221a63262e30aa752b471889119c1a11d9"} Mar 11 19:22:21 crc kubenswrapper[4842]: I0311 19:22:21.599697 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:21 crc kubenswrapper[4842]: I0311 19:22:21.629186 
4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:21 crc kubenswrapper[4842]: I0311 19:22:21.643872 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podStartSLOduration=2.643843313 podStartE2EDuration="2.643843313s" podCreationTimestamp="2026-03-11 19:22:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:22:21.625466126 +0000 UTC m=+1987.273162446" watchObservedRunningTime="2026-03-11 19:22:21.643843313 +0000 UTC m=+1987.291539603" Mar 11 19:22:22 crc kubenswrapper[4842]: I0311 19:22:22.972354 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:22 crc kubenswrapper[4842]: I0311 19:22:22.972399 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:22 crc kubenswrapper[4842]: I0311 19:22:22.974033 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:22:23 crc kubenswrapper[4842]: I0311 19:22:23.006879 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:22:23 crc kubenswrapper[4842]: I0311 19:22:23.642500 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:22:24 crc kubenswrapper[4842]: I0311 19:22:24.048537 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.1.25:8774/\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 11 19:22:24 crc kubenswrapper[4842]: I0311 19:22:24.048569 4842 prober.go:107] "Probe failed" probeType="Startup" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.1.25:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:22:24 crc kubenswrapper[4842]: I0311 19:22:24.629405 4842 generic.go:334] "Generic (PLEG): container finished" podID="257b074b-1b06-46f4-8120-1e377849955c" containerID="54c961de2b9e7df561e6418cc1c5ae221a63262e30aa752b471889119c1a11d9" exitCode=0 Mar 11 19:22:24 crc kubenswrapper[4842]: I0311 19:22:24.629506 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerDied","Data":"54c961de2b9e7df561e6418cc1c5ae221a63262e30aa752b471889119c1a11d9"} Mar 11 19:22:24 crc kubenswrapper[4842]: I0311 19:22:24.630100 4842 scope.go:117] "RemoveContainer" containerID="54c961de2b9e7df561e6418cc1c5ae221a63262e30aa752b471889119c1a11d9" Mar 11 19:22:24 crc kubenswrapper[4842]: I0311 19:22:24.997746 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:25 crc kubenswrapper[4842]: I0311 19:22:25.644591 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerStarted","Data":"b728604be5eb64805c476c48c2cbe396364e3e38990b7e6a9da6666c3dd8648b"} Mar 11 19:22:25 crc kubenswrapper[4842]: I0311 19:22:25.645125 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:25 crc kubenswrapper[4842]: I0311 19:22:25.672788 4842 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:27 crc kubenswrapper[4842]: I0311 19:22:27.951582 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:28 crc kubenswrapper[4842]: I0311 19:22:28.672117 4842 generic.go:334] "Generic (PLEG): container finished" podID="257b074b-1b06-46f4-8120-1e377849955c" containerID="b728604be5eb64805c476c48c2cbe396364e3e38990b7e6a9da6666c3dd8648b" exitCode=0 Mar 11 19:22:28 crc kubenswrapper[4842]: I0311 19:22:28.672162 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerDied","Data":"b728604be5eb64805c476c48c2cbe396364e3e38990b7e6a9da6666c3dd8648b"} Mar 11 19:22:28 crc kubenswrapper[4842]: I0311 19:22:28.672199 4842 scope.go:117] "RemoveContainer" containerID="54c961de2b9e7df561e6418cc1c5ae221a63262e30aa752b471889119c1a11d9" Mar 11 19:22:28 crc kubenswrapper[4842]: I0311 19:22:28.672817 4842 scope.go:117] "RemoveContainer" containerID="b728604be5eb64805c476c48c2cbe396364e3e38990b7e6a9da6666c3dd8648b" Mar 11 19:22:28 crc kubenswrapper[4842]: E0311 19:22:28.673023 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-kuttl-cell1-compute-fake1-compute-compute\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-kuttl-cell1-compute-fake1-compute-compute pod=nova-kuttl-cell1-compute-fake1-compute-0_nova-kuttl-default(257b074b-1b06-46f4-8120-1e377849955c)\"" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" Mar 11 19:22:29 crc kubenswrapper[4842]: I0311 19:22:29.997753 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" 
Mar 11 19:22:29 crc kubenswrapper[4842]: I0311 19:22:29.997811 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:29 crc kubenswrapper[4842]: I0311 19:22:29.998490 4842 scope.go:117] "RemoveContainer" containerID="b728604be5eb64805c476c48c2cbe396364e3e38990b7e6a9da6666c3dd8648b" Mar 11 19:22:29 crc kubenswrapper[4842]: E0311 19:22:29.998732 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-kuttl-cell1-compute-fake1-compute-compute\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-kuttl-cell1-compute-fake1-compute-compute pod=nova-kuttl-cell1-compute-fake1-compute-0_nova-kuttl-default(257b074b-1b06-46f4-8120-1e377849955c)\"" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" Mar 11 19:22:30 crc kubenswrapper[4842]: I0311 19:22:30.971056 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:30 crc kubenswrapper[4842]: I0311 19:22:30.971098 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:31 crc kubenswrapper[4842]: I0311 19:22:31.115010 4842 scope.go:117] "RemoveContainer" containerID="69dc0bc64cdcdcb52600f26a1d6456519d9859ba5deb19d9d6b4a2068d54b12e" Mar 11 19:22:31 crc kubenswrapper[4842]: I0311 19:22:31.472397 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:22:31 crc kubenswrapper[4842]: I0311 19:22:31.472803 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" 
podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:22:32 crc kubenswrapper[4842]: I0311 19:22:32.974679 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:32 crc kubenswrapper[4842]: I0311 19:22:32.974751 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:32 crc kubenswrapper[4842]: I0311 19:22:32.981556 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:32 crc kubenswrapper[4842]: I0311 19:22:32.983370 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:43 crc kubenswrapper[4842]: I0311 19:22:43.962563 4842 scope.go:117] "RemoveContainer" containerID="b728604be5eb64805c476c48c2cbe396364e3e38990b7e6a9da6666c3dd8648b" Mar 11 19:22:44 crc kubenswrapper[4842]: I0311 19:22:44.827186 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerStarted","Data":"fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0"} Mar 11 19:22:44 crc kubenswrapper[4842]: I0311 19:22:44.828084 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:44 crc kubenswrapper[4842]: I0311 19:22:44.862253 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.340153 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.350256 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-cell-mapping-d5zzc"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.359699 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.368002 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-host-discover-5bw4c"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.377614 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.386389 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-cell-mapping-gv7hw"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.394743 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.444665 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell0187d-account-delete-hghj2"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.445727 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.464853 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell0187d-account-delete-hghj2"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.482345 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.482560 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="3fd5d983-467b-4c01-aec7-079a61880193" containerName="nova-kuttl-scheduler-scheduler" containerID="cri-o://cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" gracePeriod=30 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.550788 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.551211 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-log" containerID="cri-o://46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7" gracePeriod=30 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.551518 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-metadata-0" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-metadata" containerID="cri-o://4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e" gracePeriod=30 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.560959 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp56s\" (UniqueName: 
\"kubernetes.io/projected/7229ac46-328f-479a-8ffa-3e28680ccabc-kube-api-access-hp56s\") pod \"novacell0187d-account-delete-hghj2\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.561064 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7229ac46-328f-479a-8ffa-3e28680ccabc-operator-scripts\") pod \"novacell0187d-account-delete-hghj2\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.600663 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novacell173b3-account-delete-5xnf6"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.603725 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.615280 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell173b3-account-delete-5xnf6"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.635820 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["nova-kuttl-default/novaapid894-account-delete-g8k4l"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.637236 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.662145 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapid894-account-delete-g8k4l"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.663027 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp56s\" (UniqueName: \"kubernetes.io/projected/7229ac46-328f-479a-8ffa-3e28680ccabc-kube-api-access-hp56s\") pod \"novacell0187d-account-delete-hghj2\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.663104 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7229ac46-328f-479a-8ffa-3e28680ccabc-operator-scripts\") pod \"novacell0187d-account-delete-hghj2\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.676486 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7229ac46-328f-479a-8ffa-3e28680ccabc-operator-scripts\") pod \"novacell0187d-account-delete-hghj2\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.681670 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.682004 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="fa949ad8-639a-4fc1-b4ae-b021fd3bd425" containerName="nova-kuttl-cell1-novncproxy-novncproxy" 
containerID="cri-o://4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f" gracePeriod=30 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.704703 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp56s\" (UniqueName: \"kubernetes.io/projected/7229ac46-328f-479a-8ffa-3e28680ccabc-kube-api-access-hp56s\") pod \"novacell0187d-account-delete-hghj2\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.723626 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.724190 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-api" containerID="cri-o://870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b" gracePeriod=30 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.724125 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-log" containerID="cri-o://478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80" gracePeriod=30 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.752742 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.753148 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="f574839a-5168-492d-a22b-ecfbed63b274" containerName="nova-kuttl-cell0-conductor-conductor" containerID="cri-o://2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" gracePeriod=30 Mar 11 19:22:45 crc 
kubenswrapper[4842]: I0311 19:22:45.762569 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.763972 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7692542-b7f6-4720-a4d7-46839a09792d-operator-scripts\") pod \"novaapid894-account-delete-g8k4l\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.764063 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh6l7\" (UniqueName: \"kubernetes.io/projected/d7692542-b7f6-4720-a4d7-46839a09792d-kube-api-access-gh6l7\") pod \"novaapid894-account-delete-g8k4l\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.764120 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e539e1f-8358-4179-8da7-edfd18eb6537-operator-scripts\") pod \"novacell173b3-account-delete-5xnf6\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.764173 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc99k\" (UniqueName: \"kubernetes.io/projected/8e539e1f-8358-4179-8da7-edfd18eb6537-kube-api-access-kc99k\") pod \"novacell173b3-account-delete-5xnf6\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.773325 4842 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.778141 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-db-sync-kb8x4"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.787225 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.792305 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-db-sync-xzndf"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.798464 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.798686 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" podUID="b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" containerName="nova-kuttl-cell1-conductor-conductor" containerID="cri-o://a4a9008cebab810984a179f74dd3c1c4bc28e5dd045f9ce13ce6885b40bfa6ce" gracePeriod=30 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.873130 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7692542-b7f6-4720-a4d7-46839a09792d-operator-scripts\") pod \"novaapid894-account-delete-g8k4l\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.871213 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7692542-b7f6-4720-a4d7-46839a09792d-operator-scripts\") pod \"novaapid894-account-delete-g8k4l\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " 
pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.873545 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh6l7\" (UniqueName: \"kubernetes.io/projected/d7692542-b7f6-4720-a4d7-46839a09792d-kube-api-access-gh6l7\") pod \"novaapid894-account-delete-g8k4l\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.876009 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e539e1f-8358-4179-8da7-edfd18eb6537-operator-scripts\") pod \"novacell173b3-account-delete-5xnf6\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.876134 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc99k\" (UniqueName: \"kubernetes.io/projected/8e539e1f-8358-4179-8da7-edfd18eb6537-kube-api-access-kc99k\") pod \"novacell173b3-account-delete-5xnf6\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.880647 4842 generic.go:334] "Generic (PLEG): container finished" podID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerID="46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7" exitCode=143 Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.881292 4842 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" secret="" err="secret \"nova-nova-kuttl-dockercfg-pf4x2\" not found" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.881334 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"d786c50e-6bfb-4b96-bec9-cfa618ab848a","Type":"ContainerDied","Data":"46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7"} Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.885248 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e539e1f-8358-4179-8da7-edfd18eb6537-operator-scripts\") pod \"novacell173b3-account-delete-5xnf6\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.892491 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh6l7\" (UniqueName: \"kubernetes.io/projected/d7692542-b7f6-4720-a4d7-46839a09792d-kube-api-access-gh6l7\") pod \"novaapid894-account-delete-g8k4l\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.898771 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc99k\" (UniqueName: \"kubernetes.io/projected/8e539e1f-8358-4179-8da7-edfd18eb6537-kube-api-access-kc99k\") pod \"novacell173b3-account-delete-5xnf6\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: I0311 19:22:45.923617 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:45 crc kubenswrapper[4842]: E0311 19:22:45.978282 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-config-data: secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:45 crc kubenswrapper[4842]: E0311 19:22:45.978897 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data podName:257b074b-1b06-46f4-8120-1e377849955c nodeName:}" failed. No retries permitted until 2026-03-11 19:22:46.478868928 +0000 UTC m=+2012.126565208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data") pod "nova-kuttl-cell1-compute-fake1-compute-0" (UID: "257b074b-1b06-46f4-8120-1e377849955c") : secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.046433 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.315770 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell0187d-account-delete-hghj2"] Mar 11 19:22:46 crc kubenswrapper[4842]: E0311 19:22:46.488115 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-config-data: secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.488179 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novacell173b3-account-delete-5xnf6"] Mar 11 19:22:46 crc kubenswrapper[4842]: E0311 19:22:46.488229 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data podName:257b074b-1b06-46f4-8120-1e377849955c nodeName:}" failed. No retries permitted until 2026-03-11 19:22:47.488199399 +0000 UTC m=+2013.135895679 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data") pod "nova-kuttl-cell1-compute-fake1-compute-0" (UID: "257b074b-1b06-46f4-8120-1e377849955c") : secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.601676 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["nova-kuttl-default/novaapid894-account-delete-g8k4l"] Mar 11 19:22:46 crc kubenswrapper[4842]: W0311 19:22:46.607558 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7692542_b7f6_4720_a4d7_46839a09792d.slice/crio-c7ef68a85a9ea6198320fca786f1a9caf61c11fae56d7e279a6a89ce9c18a738 WatchSource:0}: Error finding container c7ef68a85a9ea6198320fca786f1a9caf61c11fae56d7e279a6a89ce9c18a738: Status 404 returned error can't find the container with id c7ef68a85a9ea6198320fca786f1a9caf61c11fae56d7e279a6a89ce9c18a738 Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.659825 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.791957 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh9bz\" (UniqueName: \"kubernetes.io/projected/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-kube-api-access-rh9bz\") pod \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.792499 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-config-data\") pod \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\" (UID: \"fa949ad8-639a-4fc1-b4ae-b021fd3bd425\") " Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.801483 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-kube-api-access-rh9bz" (OuterVolumeSpecName: "kube-api-access-rh9bz") pod "fa949ad8-639a-4fc1-b4ae-b021fd3bd425" (UID: "fa949ad8-639a-4fc1-b4ae-b021fd3bd425"). InnerVolumeSpecName "kube-api-access-rh9bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.833906 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-config-data" (OuterVolumeSpecName: "config-data") pod "fa949ad8-639a-4fc1-b4ae-b021fd3bd425" (UID: "fa949ad8-639a-4fc1-b4ae-b021fd3bd425"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.890206 4842 generic.go:334] "Generic (PLEG): container finished" podID="fa949ad8-639a-4fc1-b4ae-b021fd3bd425" containerID="4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f" exitCode=0 Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.890294 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"fa949ad8-639a-4fc1-b4ae-b021fd3bd425","Type":"ContainerDied","Data":"4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.890328 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" event={"ID":"fa949ad8-639a-4fc1-b4ae-b021fd3bd425","Type":"ContainerDied","Data":"9811db1191c75507fa9b357560876cba6ca5c97156eb4778b357c88f43d62d99"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.890344 4842 scope.go:117] "RemoveContainer" containerID="4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.890408 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.892525 4842 generic.go:334] "Generic (PLEG): container finished" podID="8e539e1f-8358-4179-8da7-edfd18eb6537" containerID="2aea09330816f211682dc660c7039ee0c2b32e7b3dd1c462dd23655d4aecc2e3" exitCode=0 Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.892596 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" event={"ID":"8e539e1f-8358-4179-8da7-edfd18eb6537","Type":"ContainerDied","Data":"2aea09330816f211682dc660c7039ee0c2b32e7b3dd1c462dd23655d4aecc2e3"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.892629 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" event={"ID":"8e539e1f-8358-4179-8da7-edfd18eb6537","Type":"ContainerStarted","Data":"e3de324f2c7cd9781ae555c78f08e5a57c625ea946a9dc5810efe029811f763d"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.894412 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.894435 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh9bz\" (UniqueName: \"kubernetes.io/projected/fa949ad8-639a-4fc1-b4ae-b021fd3bd425-kube-api-access-rh9bz\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.895748 4842 generic.go:334] "Generic (PLEG): container finished" podID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerID="478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80" exitCode=143 Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.895796 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" 
event={"ID":"75caedc3-0ec9-4f3f-a381-8459fe9cad15","Type":"ContainerDied","Data":"478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.897066 4842 generic.go:334] "Generic (PLEG): container finished" podID="7229ac46-328f-479a-8ffa-3e28680ccabc" containerID="b01b8cf1f8126371cd36db897da5cf7c85e239890ae9badcaa7e67558892f64e" exitCode=0 Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.897114 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" event={"ID":"7229ac46-328f-479a-8ffa-3e28680ccabc","Type":"ContainerDied","Data":"b01b8cf1f8126371cd36db897da5cf7c85e239890ae9badcaa7e67558892f64e"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.897129 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" event={"ID":"7229ac46-328f-479a-8ffa-3e28680ccabc","Type":"ContainerStarted","Data":"fb00b595f17edfd130dd2c7bbee7d8b44eb8e2af0c974804bdb790e62217928f"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.901786 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute" containerID="cri-o://fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" gracePeriod=30 Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.902822 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" event={"ID":"d7692542-b7f6-4720-a4d7-46839a09792d","Type":"ContainerStarted","Data":"1854ee38df937e7f30868ec7405a09f697d8be7cf595110682c45572dd57640d"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.902849 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" 
event={"ID":"d7692542-b7f6-4720-a4d7-46839a09792d","Type":"ContainerStarted","Data":"c7ef68a85a9ea6198320fca786f1a9caf61c11fae56d7e279a6a89ce9c18a738"} Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.942068 4842 scope.go:117] "RemoveContainer" containerID="4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f" Mar 11 19:22:46 crc kubenswrapper[4842]: E0311 19:22:46.942605 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f\": container with ID starting with 4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f not found: ID does not exist" containerID="4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.942640 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f"} err="failed to get container status \"4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f\": rpc error: code = NotFound desc = could not find container \"4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f\": container with ID starting with 4b4c119cc4d58108ed68bc1bf517265accacad7feb76019dc1781624adf9d37f not found: ID does not exist" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.953113 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" podStartSLOduration=1.953094455 podStartE2EDuration="1.953094455s" podCreationTimestamp="2026-03-11 19:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:22:46.945657018 +0000 UTC m=+2012.593353298" watchObservedRunningTime="2026-03-11 19:22:46.953094455 +0000 UTC m=+2012.600790735" Mar 11 19:22:46 crc 
kubenswrapper[4842]: I0311 19:22:46.972514 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406febbd-9625-4fa7-a281-0bb7c2a4fb19" path="/var/lib/kubelet/pods/406febbd-9625-4fa7-a281-0bb7c2a4fb19/volumes" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.973197 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad288058-ff46-425c-a5e2-4313ed4e2688" path="/var/lib/kubelet/pods/ad288058-ff46-425c-a5e2-4313ed4e2688/volumes" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.974535 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b512fa33-0320-4516-b999-738699cd428b" path="/var/lib/kubelet/pods/b512fa33-0320-4516-b999-738699cd428b/volumes" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.976174 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c689843a-7097-40e5-a6dc-45b0fff3f1f9" path="/var/lib/kubelet/pods/c689843a-7097-40e5-a6dc-45b0fff3f1f9/volumes" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.977041 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7588b4a-b06c-4e85-a2db-4750cb57d53f" path="/var/lib/kubelet/pods/e7588b4a-b06c-4e85-a2db-4750cb57d53f/volumes" Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.977742 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:22:46 crc kubenswrapper[4842]: I0311 19:22:46.980956 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-novncproxy-0"] Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.509982 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-config-data: secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.510390 4842 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data podName:257b074b-1b06-46f4-8120-1e377849955c nodeName:}" failed. No retries permitted until 2026-03-11 19:22:49.510371376 +0000 UTC m=+2015.158067656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data") pod "nova-kuttl-cell1-compute-fake1-compute-0" (UID: "257b074b-1b06-46f4-8120-1e377849955c") : secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.940320 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:22:47 crc kubenswrapper[4842]: I0311 19:22:47.947803 4842 generic.go:334] "Generic (PLEG): container finished" podID="b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" containerID="a4a9008cebab810984a179f74dd3c1c4bc28e5dd045f9ce13ce6885b40bfa6ce" exitCode=0 Mar 11 19:22:47 crc kubenswrapper[4842]: I0311 19:22:47.947874 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1","Type":"ContainerDied","Data":"a4a9008cebab810984a179f74dd3c1c4bc28e5dd045f9ce13ce6885b40bfa6ce"} Mar 11 19:22:47 crc kubenswrapper[4842]: I0311 19:22:47.949634 4842 generic.go:334] "Generic (PLEG): container finished" podID="d7692542-b7f6-4720-a4d7-46839a09792d" containerID="1854ee38df937e7f30868ec7405a09f697d8be7cf595110682c45572dd57640d" exitCode=0 Mar 11 19:22:47 crc kubenswrapper[4842]: I0311 19:22:47.950165 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" 
event={"ID":"d7692542-b7f6-4720-a4d7-46839a09792d","Type":"ContainerDied","Data":"1854ee38df937e7f30868ec7405a09f697d8be7cf595110682c45572dd57640d"} Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.963460 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.971295 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.971365 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" podUID="f574839a-5168-492d-a22b-ecfbed63b274" containerName="nova-kuttl-cell0-conductor-conductor" Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.977288 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.980408 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.989841 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 19:22:47 crc kubenswrapper[4842]: E0311 19:22:47.989924 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-scheduler-0" podUID="3fd5d983-467b-4c01-aec7-079a61880193" containerName="nova-kuttl-scheduler-scheduler" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.041506 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.146961 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spkrc\" (UniqueName: \"kubernetes.io/projected/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-kube-api-access-spkrc\") pod \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.147080 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-config-data\") pod \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\" (UID: \"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1\") " Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.182051 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-kube-api-access-spkrc" (OuterVolumeSpecName: "kube-api-access-spkrc") pod "b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" (UID: "b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1"). InnerVolumeSpecName "kube-api-access-spkrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.206514 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-config-data" (OuterVolumeSpecName: "config-data") pod "b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" (UID: "b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.249633 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.249669 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spkrc\" (UniqueName: \"kubernetes.io/projected/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1-kube-api-access-spkrc\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.284929 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.331981 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.453970 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc99k\" (UniqueName: \"kubernetes.io/projected/8e539e1f-8358-4179-8da7-edfd18eb6537-kube-api-access-kc99k\") pod \"8e539e1f-8358-4179-8da7-edfd18eb6537\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.454066 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e539e1f-8358-4179-8da7-edfd18eb6537-operator-scripts\") pod \"8e539e1f-8358-4179-8da7-edfd18eb6537\" (UID: \"8e539e1f-8358-4179-8da7-edfd18eb6537\") " Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.454239 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp56s\" (UniqueName: 
\"kubernetes.io/projected/7229ac46-328f-479a-8ffa-3e28680ccabc-kube-api-access-hp56s\") pod \"7229ac46-328f-479a-8ffa-3e28680ccabc\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.454317 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7229ac46-328f-479a-8ffa-3e28680ccabc-operator-scripts\") pod \"7229ac46-328f-479a-8ffa-3e28680ccabc\" (UID: \"7229ac46-328f-479a-8ffa-3e28680ccabc\") " Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.454822 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e539e1f-8358-4179-8da7-edfd18eb6537-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e539e1f-8358-4179-8da7-edfd18eb6537" (UID: "8e539e1f-8358-4179-8da7-edfd18eb6537"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.454875 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7229ac46-328f-479a-8ffa-3e28680ccabc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7229ac46-328f-479a-8ffa-3e28680ccabc" (UID: "7229ac46-328f-479a-8ffa-3e28680ccabc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.457341 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e539e1f-8358-4179-8da7-edfd18eb6537-kube-api-access-kc99k" (OuterVolumeSpecName: "kube-api-access-kc99k") pod "8e539e1f-8358-4179-8da7-edfd18eb6537" (UID: "8e539e1f-8358-4179-8da7-edfd18eb6537"). InnerVolumeSpecName "kube-api-access-kc99k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.457393 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7229ac46-328f-479a-8ffa-3e28680ccabc-kube-api-access-hp56s" (OuterVolumeSpecName: "kube-api-access-hp56s") pod "7229ac46-328f-479a-8ffa-3e28680ccabc" (UID: "7229ac46-328f-479a-8ffa-3e28680ccabc"). InnerVolumeSpecName "kube-api-access-hp56s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.556219 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp56s\" (UniqueName: \"kubernetes.io/projected/7229ac46-328f-479a-8ffa-3e28680ccabc-kube-api-access-hp56s\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.556280 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7229ac46-328f-479a-8ffa-3e28680ccabc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.556291 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc99k\" (UniqueName: \"kubernetes.io/projected/8e539e1f-8358-4179-8da7-edfd18eb6537-kube-api-access-kc99k\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.556301 4842 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e539e1f-8358-4179-8da7-edfd18eb6537-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.964169 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.965968 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.966762 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.1.25:8774/\": dial tcp 10.217.1.25:8774: connect: connection refused" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.966842 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-api-0" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-api" probeResult="failure" output="Get \"http://10.217.1.25:8774/\": dial tcp 10.217.1.25:8774: connect: connection refused" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.967812 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.989216 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa949ad8-639a-4fc1-b4ae-b021fd3bd425" path="/var/lib/kubelet/pods/fa949ad8-639a-4fc1-b4ae-b021fd3bd425/volumes" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.990679 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell0187d-account-delete-hghj2" event={"ID":"7229ac46-328f-479a-8ffa-3e28680ccabc","Type":"ContainerDied","Data":"fb00b595f17edfd130dd2c7bbee7d8b44eb8e2af0c974804bdb790e62217928f"} Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.990717 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb00b595f17edfd130dd2c7bbee7d8b44eb8e2af0c974804bdb790e62217928f" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.990729 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novacell173b3-account-delete-5xnf6" 
event={"ID":"8e539e1f-8358-4179-8da7-edfd18eb6537","Type":"ContainerDied","Data":"e3de324f2c7cd9781ae555c78f08e5a57c625ea946a9dc5810efe029811f763d"} Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.990739 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3de324f2c7cd9781ae555c78f08e5a57c625ea946a9dc5810efe029811f763d" Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.990747 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-conductor-0" event={"ID":"b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1","Type":"ContainerDied","Data":"348968470c193dd3e43e67205c5d97f8bb4f25b1fdf70f23464358fc8e783b0c"} Mar 11 19:22:48 crc kubenswrapper[4842]: I0311 19:22:48.990768 4842 scope.go:117] "RemoveContainer" containerID="a4a9008cebab810984a179f74dd3c1c4bc28e5dd045f9ce13ce6885b40bfa6ce" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.182109 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.196427 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-conductor-0"] Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.431425 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.455058 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.478041 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.574874 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d786c50e-6bfb-4b96-bec9-cfa618ab848a-logs\") pod \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.574954 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75caedc3-0ec9-4f3f-a381-8459fe9cad15-config-data\") pod \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.574974 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d786c50e-6bfb-4b96-bec9-cfa618ab848a-config-data\") pod \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.575001 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv62b\" (UniqueName: \"kubernetes.io/projected/d786c50e-6bfb-4b96-bec9-cfa618ab848a-kube-api-access-zv62b\") pod \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\" (UID: \"d786c50e-6bfb-4b96-bec9-cfa618ab848a\") " Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.575044 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75caedc3-0ec9-4f3f-a381-8459fe9cad15-logs\") pod \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.575101 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sfx9\" (UniqueName: 
\"kubernetes.io/projected/75caedc3-0ec9-4f3f-a381-8459fe9cad15-kube-api-access-8sfx9\") pod \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\" (UID: \"75caedc3-0ec9-4f3f-a381-8459fe9cad15\") " Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.575134 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh6l7\" (UniqueName: \"kubernetes.io/projected/d7692542-b7f6-4720-a4d7-46839a09792d-kube-api-access-gh6l7\") pod \"d7692542-b7f6-4720-a4d7-46839a09792d\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.575165 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7692542-b7f6-4720-a4d7-46839a09792d-operator-scripts\") pod \"d7692542-b7f6-4720-a4d7-46839a09792d\" (UID: \"d7692542-b7f6-4720-a4d7-46839a09792d\") " Mar 11 19:22:49 crc kubenswrapper[4842]: E0311 19:22:49.575653 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-config-data: secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:49 crc kubenswrapper[4842]: E0311 19:22:49.575707 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data podName:257b074b-1b06-46f4-8120-1e377849955c nodeName:}" failed. No retries permitted until 2026-03-11 19:22:53.575691735 +0000 UTC m=+2019.223388015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data") pod "nova-kuttl-cell1-compute-fake1-compute-0" (UID: "257b074b-1b06-46f4-8120-1e377849955c") : secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.576532 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d786c50e-6bfb-4b96-bec9-cfa618ab848a-logs" (OuterVolumeSpecName: "logs") pod "d786c50e-6bfb-4b96-bec9-cfa618ab848a" (UID: "d786c50e-6bfb-4b96-bec9-cfa618ab848a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.578008 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75caedc3-0ec9-4f3f-a381-8459fe9cad15-logs" (OuterVolumeSpecName: "logs") pod "75caedc3-0ec9-4f3f-a381-8459fe9cad15" (UID: "75caedc3-0ec9-4f3f-a381-8459fe9cad15"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.578132 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7692542-b7f6-4720-a4d7-46839a09792d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7692542-b7f6-4720-a4d7-46839a09792d" (UID: "d7692542-b7f6-4720-a4d7-46839a09792d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.582489 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d786c50e-6bfb-4b96-bec9-cfa618ab848a-kube-api-access-zv62b" (OuterVolumeSpecName: "kube-api-access-zv62b") pod "d786c50e-6bfb-4b96-bec9-cfa618ab848a" (UID: "d786c50e-6bfb-4b96-bec9-cfa618ab848a"). InnerVolumeSpecName "kube-api-access-zv62b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.582577 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7692542-b7f6-4720-a4d7-46839a09792d-kube-api-access-gh6l7" (OuterVolumeSpecName: "kube-api-access-gh6l7") pod "d7692542-b7f6-4720-a4d7-46839a09792d" (UID: "d7692542-b7f6-4720-a4d7-46839a09792d"). InnerVolumeSpecName "kube-api-access-gh6l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.583639 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75caedc3-0ec9-4f3f-a381-8459fe9cad15-kube-api-access-8sfx9" (OuterVolumeSpecName: "kube-api-access-8sfx9") pod "75caedc3-0ec9-4f3f-a381-8459fe9cad15" (UID: "75caedc3-0ec9-4f3f-a381-8459fe9cad15"). InnerVolumeSpecName "kube-api-access-8sfx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.603197 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75caedc3-0ec9-4f3f-a381-8459fe9cad15-config-data" (OuterVolumeSpecName: "config-data") pod "75caedc3-0ec9-4f3f-a381-8459fe9cad15" (UID: "75caedc3-0ec9-4f3f-a381-8459fe9cad15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.604499 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d786c50e-6bfb-4b96-bec9-cfa618ab848a-config-data" (OuterVolumeSpecName: "config-data") pod "d786c50e-6bfb-4b96-bec9-cfa618ab848a" (UID: "d786c50e-6bfb-4b96-bec9-cfa618ab848a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.676952 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d786c50e-6bfb-4b96-bec9-cfa618ab848a-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.677442 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75caedc3-0ec9-4f3f-a381-8459fe9cad15-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.677504 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d786c50e-6bfb-4b96-bec9-cfa618ab848a-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.677582 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv62b\" (UniqueName: \"kubernetes.io/projected/d786c50e-6bfb-4b96-bec9-cfa618ab848a-kube-api-access-zv62b\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.677645 4842 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75caedc3-0ec9-4f3f-a381-8459fe9cad15-logs\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.677709 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sfx9\" (UniqueName: \"kubernetes.io/projected/75caedc3-0ec9-4f3f-a381-8459fe9cad15-kube-api-access-8sfx9\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.677763 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh6l7\" (UniqueName: \"kubernetes.io/projected/d7692542-b7f6-4720-a4d7-46839a09792d-kube-api-access-gh6l7\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.677815 4842 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7692542-b7f6-4720-a4d7-46839a09792d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.924925 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.980935 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" event={"ID":"d7692542-b7f6-4720-a4d7-46839a09792d","Type":"ContainerDied","Data":"c7ef68a85a9ea6198320fca786f1a9caf61c11fae56d7e279a6a89ce9c18a738"} Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.980988 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7ef68a85a9ea6198320fca786f1a9caf61c11fae56d7e279a6a89ce9c18a738" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.980998 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/novaapid894-account-delete-g8k4l" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.984694 4842 generic.go:334] "Generic (PLEG): container finished" podID="f574839a-5168-492d-a22b-ecfbed63b274" containerID="2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" exitCode=0 Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.984740 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"f574839a-5168-492d-a22b-ecfbed63b274","Type":"ContainerDied","Data":"2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8"} Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.984811 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" event={"ID":"f574839a-5168-492d-a22b-ecfbed63b274","Type":"ContainerDied","Data":"3228f88e8d951f473d152f9f48caa493a1640c7e26698c5e1d6dcba50ce26578"} Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.984829 4842 scope.go:117] "RemoveContainer" containerID="2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.984824 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell0-conductor-0" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.990687 4842 generic.go:334] "Generic (PLEG): container finished" podID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerID="4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e" exitCode=0 Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.990893 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-metadata-0" Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.991123 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"d786c50e-6bfb-4b96-bec9-cfa618ab848a","Type":"ContainerDied","Data":"4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e"} Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.991155 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-metadata-0" event={"ID":"d786c50e-6bfb-4b96-bec9-cfa618ab848a","Type":"ContainerDied","Data":"877bf2a539615cbfbcacce68034981e4d4901ab49bef3b48073081dfe7067a4e"} Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.997507 4842 generic.go:334] "Generic (PLEG): container finished" podID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerID="870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b" exitCode=0 Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.997558 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"75caedc3-0ec9-4f3f-a381-8459fe9cad15","Type":"ContainerDied","Data":"870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b"} Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.997591 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-api-0" event={"ID":"75caedc3-0ec9-4f3f-a381-8459fe9cad15","Type":"ContainerDied","Data":"96aeec149283d8a185f8370a9b84ae82c22eca755733ce82ef95851639c4582c"} Mar 11 19:22:49 crc kubenswrapper[4842]: I0311 19:22:49.997658 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-api-0" Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.013198 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.014897 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.017037 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.017126 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.038616 4842 scope.go:117] "RemoveContainer" containerID="2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.043238 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8\": container with ID starting with 2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8 not found: ID does not exist" containerID="2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.043310 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8"} err="failed to get container status \"2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8\": rpc error: code = NotFound desc = could not find container \"2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8\": container with ID starting with 2eae609778adeb886b70785d1fa4e2edc246dbc42ca139871404c428061226b8 not found: ID does not exist" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.043353 4842 scope.go:117] "RemoveContainer" containerID="4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.056037 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.081690 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-api-0"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.082422 4842 scope.go:117] "RemoveContainer" containerID="46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.085784 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7ft5\" (UniqueName: \"kubernetes.io/projected/f574839a-5168-492d-a22b-ecfbed63b274-kube-api-access-r7ft5\") pod \"f574839a-5168-492d-a22b-ecfbed63b274\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " Mar 11 19:22:50 crc kubenswrapper[4842]: 
I0311 19:22:50.085878 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f574839a-5168-492d-a22b-ecfbed63b274-config-data\") pod \"f574839a-5168-492d-a22b-ecfbed63b274\" (UID: \"f574839a-5168-492d-a22b-ecfbed63b274\") " Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.088287 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.096540 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f574839a-5168-492d-a22b-ecfbed63b274-kube-api-access-r7ft5" (OuterVolumeSpecName: "kube-api-access-r7ft5") pod "f574839a-5168-492d-a22b-ecfbed63b274" (UID: "f574839a-5168-492d-a22b-ecfbed63b274"). InnerVolumeSpecName "kube-api-access-r7ft5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.096736 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-metadata-0"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.098545 4842 scope.go:117] "RemoveContainer" containerID="4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e" Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.098995 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e\": container with ID starting with 4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e not found: ID does not exist" containerID="4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.099027 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e"} err="failed to get 
container status \"4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e\": rpc error: code = NotFound desc = could not find container \"4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e\": container with ID starting with 4bdf37e7bf5be1a0c0901df559020e4a8c53b7fe32994706d80db06fcf27003e not found: ID does not exist" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.099049 4842 scope.go:117] "RemoveContainer" containerID="46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7" Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.099359 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7\": container with ID starting with 46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7 not found: ID does not exist" containerID="46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.099400 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7"} err="failed to get container status \"46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7\": rpc error: code = NotFound desc = could not find container \"46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7\": container with ID starting with 46e4c472b0c6139160513a3202ba1732fdd1d90caffc52e1b52542ac627b69a7 not found: ID does not exist" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.099433 4842 scope.go:117] "RemoveContainer" containerID="870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.110187 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f574839a-5168-492d-a22b-ecfbed63b274-config-data" (OuterVolumeSpecName: 
"config-data") pod "f574839a-5168-492d-a22b-ecfbed63b274" (UID: "f574839a-5168-492d-a22b-ecfbed63b274"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.170515 4842 scope.go:117] "RemoveContainer" containerID="478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.186060 4842 scope.go:117] "RemoveContainer" containerID="870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b" Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.186490 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b\": container with ID starting with 870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b not found: ID does not exist" containerID="870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.186525 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b"} err="failed to get container status \"870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b\": rpc error: code = NotFound desc = could not find container \"870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b\": container with ID starting with 870a6b09947ba9c3bd949f8bbf6841c6b625ed5c96738975e0399a5ccc30a86b not found: ID does not exist" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.186554 4842 scope.go:117] "RemoveContainer" containerID="478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80" Mar 11 19:22:50 crc kubenswrapper[4842]: E0311 19:22:50.186820 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80\": container with ID starting with 478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80 not found: ID does not exist" containerID="478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.186847 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80"} err="failed to get container status \"478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80\": rpc error: code = NotFound desc = could not find container \"478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80\": container with ID starting with 478331dde361d706ecbddc24b1ee75919a32378b1a5e5ef1250cb0464f48fa80 not found: ID does not exist" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.187539 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f574839a-5168-492d-a22b-ecfbed63b274-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.187558 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7ft5\" (UniqueName: \"kubernetes.io/projected/f574839a-5168-492d-a22b-ecfbed63b274-kube-api-access-r7ft5\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.321918 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.331212 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell0-conductor-0"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.474303 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-b7dsg"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.483042 4842 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-db-create-b7dsg"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.491325 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.499245 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell0187d-account-delete-hghj2"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.508511 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell0187d-account-delete-hghj2"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.515902 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell0-187d-account-create-update-lbfmn"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.550520 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-76zdk"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.562433 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-db-create-76zdk"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.568804 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.576711 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novacell173b3-account-delete-5xnf6"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.583640 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novacell173b3-account-delete-5xnf6"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.588893 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-cell1-73b3-account-create-update-sj8dw"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.652490 4842 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-db-create-nnvg2"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.660878 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-db-create-nnvg2"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.667221 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-api-d894-account-create-update-m4xhn"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.672601 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/novaapid894-account-delete-g8k4l"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.678657 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-api-d894-account-create-update-m4xhn"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.683974 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/novaapid894-account-delete-g8k4l"] Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.976049 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0054e5b3-ea14-46c7-8742-8f9c9ff9a705" path="/var/lib/kubelet/pods/0054e5b3-ea14-46c7-8742-8f9c9ff9a705/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.977027 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1580830f-2870-436d-b982-a1775fc494bb" path="/var/lib/kubelet/pods/1580830f-2870-436d-b982-a1775fc494bb/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.977500 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b02d76-eca4-492a-b9c7-29a77627d816" path="/var/lib/kubelet/pods/23b02d76-eca4-492a-b9c7-29a77627d816/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.977961 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ec451f-d841-42fc-a6ab-3d81f62be3df" path="/var/lib/kubelet/pods/28ec451f-d841-42fc-a6ab-3d81f62be3df/volumes" Mar 11 19:22:50 crc 
kubenswrapper[4842]: I0311 19:22:50.978934 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f67413c-94d7-4948-aec7-086827349cc6" path="/var/lib/kubelet/pods/4f67413c-94d7-4948-aec7-086827349cc6/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.979445 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6134a356-636f-4379-9bd1-86db49454ca5" path="/var/lib/kubelet/pods/6134a356-636f-4379-9bd1-86db49454ca5/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.979896 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7229ac46-328f-479a-8ffa-3e28680ccabc" path="/var/lib/kubelet/pods/7229ac46-328f-479a-8ffa-3e28680ccabc/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.980831 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" path="/var/lib/kubelet/pods/75caedc3-0ec9-4f3f-a381-8459fe9cad15/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.981331 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e539e1f-8358-4179-8da7-edfd18eb6537" path="/var/lib/kubelet/pods/8e539e1f-8358-4179-8da7-edfd18eb6537/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.981790 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" path="/var/lib/kubelet/pods/b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.982726 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7692542-b7f6-4720-a4d7-46839a09792d" path="/var/lib/kubelet/pods/d7692542-b7f6-4720-a4d7-46839a09792d/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.983211 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" path="/var/lib/kubelet/pods/d786c50e-6bfb-4b96-bec9-cfa618ab848a/volumes" Mar 11 19:22:50 crc 
kubenswrapper[4842]: I0311 19:22:50.983821 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f574839a-5168-492d-a22b-ecfbed63b274" path="/var/lib/kubelet/pods/f574839a-5168-492d-a22b-ecfbed63b274/volumes" Mar 11 19:22:50 crc kubenswrapper[4842]: I0311 19:22:50.989020 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_keystone-64f57b6d8c-cz78k_4291a0cb-5c38-424b-bc49-301aab1e1f1a/keystone-api/0.log" Mar 11 19:22:51 crc kubenswrapper[4842]: I0311 19:22:51.610072 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-novncproxy-0" podUID="fa949ad8-639a-4fc1-b4ae-b021fd3bd425" containerName="nova-kuttl-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.1.11:6080/vnc_lite.html\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:22:51 crc kubenswrapper[4842]: I0311 19:22:51.898044 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.014710 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dskz\" (UniqueName: \"kubernetes.io/projected/3fd5d983-467b-4c01-aec7-079a61880193-kube-api-access-5dskz\") pod \"3fd5d983-467b-4c01-aec7-079a61880193\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.014777 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd5d983-467b-4c01-aec7-079a61880193-config-data\") pod \"3fd5d983-467b-4c01-aec7-079a61880193\" (UID: \"3fd5d983-467b-4c01-aec7-079a61880193\") " Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.024177 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd5d983-467b-4c01-aec7-079a61880193-kube-api-access-5dskz" (OuterVolumeSpecName: "kube-api-access-5dskz") pod "3fd5d983-467b-4c01-aec7-079a61880193" (UID: "3fd5d983-467b-4c01-aec7-079a61880193"). InnerVolumeSpecName "kube-api-access-5dskz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.027729 4842 generic.go:334] "Generic (PLEG): container finished" podID="3fd5d983-467b-4c01-aec7-079a61880193" containerID="cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" exitCode=0 Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.027776 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"3fd5d983-467b-4c01-aec7-079a61880193","Type":"ContainerDied","Data":"cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b"} Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.027807 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-scheduler-0" event={"ID":"3fd5d983-467b-4c01-aec7-079a61880193","Type":"ContainerDied","Data":"5bc9f33fc9d00826b9790a5a91fbd5a288ae9d5a9661503c1b1cd04a120e6025"} Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.027826 4842 scope.go:117] "RemoveContainer" containerID="cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.027991 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-scheduler-0" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.049424 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd5d983-467b-4c01-aec7-079a61880193-config-data" (OuterVolumeSpecName: "config-data") pod "3fd5d983-467b-4c01-aec7-079a61880193" (UID: "3fd5d983-467b-4c01-aec7-079a61880193"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.113792 4842 scope.go:117] "RemoveContainer" containerID="cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" Mar 11 19:22:52 crc kubenswrapper[4842]: E0311 19:22:52.114490 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b\": container with ID starting with cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b not found: ID does not exist" containerID="cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.114522 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b"} err="failed to get container status \"cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b\": rpc error: code = NotFound desc = could not find container \"cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b\": container with ID starting with cba87a2ead016519da49fe05643c2693793e55195ab34e089313dad72849197b not found: ID does not exist" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.116701 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dskz\" (UniqueName: \"kubernetes.io/projected/3fd5d983-467b-4c01-aec7-079a61880193-kube-api-access-5dskz\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.116810 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd5d983-467b-4c01-aec7-079a61880193-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.368537 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] 
Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.376474 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-scheduler-0"] Mar 11 19:22:52 crc kubenswrapper[4842]: I0311 19:22:52.972011 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd5d983-467b-4c01-aec7-079a61880193" path="/var/lib/kubelet/pods/3fd5d983-467b-4c01-aec7-079a61880193/volumes" Mar 11 19:22:53 crc kubenswrapper[4842]: E0311 19:22:53.640531 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-config-data: secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:53 crc kubenswrapper[4842]: E0311 19:22:53.640607 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data podName:257b074b-1b06-46f4-8120-1e377849955c nodeName:}" failed. No retries permitted until 2026-03-11 19:23:01.640590152 +0000 UTC m=+2027.288286452 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data") pod "nova-kuttl-cell1-compute-fake1-compute-0" (UID: "257b074b-1b06-46f4-8120-1e377849955c") : secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found Mar 11 19:22:54 crc kubenswrapper[4842]: I0311 19:22:54.421017 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_memcached-0_6f57e7eb-fa53-4182-9531-a3ebcd1df17c/memcached/0.log" Mar 11 19:22:55 crc kubenswrapper[4842]: E0311 19:22:54.999880 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:55 crc kubenswrapper[4842]: E0311 19:22:55.001816 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:55 crc kubenswrapper[4842]: E0311 19:22:55.003423 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"] Mar 11 19:22:55 crc kubenswrapper[4842]: E0311 19:22:55.003508 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:22:57 crc kubenswrapper[4842]: I0311 19:22:57.728074 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_nova-kuttl-cell1-compute-fake1-compute-0_257b074b-1b06-46f4-8120-1e377849955c/nova-kuttl-cell1-compute-fake1-compute-compute/2.log"
Mar 11 19:22:59 crc kubenswrapper[4842]: I0311 19:22:59.658046 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-cell1-galera-0_0e137603-1bc4-4ccf-ba33-09993a8e6e79/galera/0.log"
Mar 11 19:23:00 crc kubenswrapper[4842]: E0311 19:23:00.000011 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:00 crc kubenswrapper[4842]: E0311 19:23:00.001071 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:00 crc kubenswrapper[4842]: E0311 19:23:00.002285 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:00 crc kubenswrapper[4842]: E0311 19:23:00.002320 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:00 crc kubenswrapper[4842]: I0311 19:23:00.046853 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-galera-0_2b22b349-fc5f-4da6-818f-412f7dde5f00/galera/0.log"
Mar 11 19:23:00 crc kubenswrapper[4842]: I0311 19:23:00.432802 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstackclient_b7dcae57-2024-4bfa-b657-f16d16bfd6c7/openstackclient/0.log"
Mar 11 19:23:00 crc kubenswrapper[4842]: I0311 19:23:00.848313 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_placement-79b56db87d-ltvb2_35af45e3-739f-4769-a843-c951ad001e2e/placement-log/0.log"
Mar 11 19:23:01 crc kubenswrapper[4842]: I0311 19:23:01.264804 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-broadcaster-server-0_baa6ffd5-2b78-4119-b6f1-a70465d5288d/rabbitmq/0.log"
Mar 11 19:23:01 crc kubenswrapper[4842]: I0311 19:23:01.471698 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:23:01 crc kubenswrapper[4842]: I0311 19:23:01.471780 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:23:01 crc kubenswrapper[4842]: E0311 19:23:01.668622 4842 secret.go:188] Couldn't get secret nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-config-data: secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found
Mar 11 19:23:01 crc kubenswrapper[4842]: E0311 19:23:01.668747 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data podName:257b074b-1b06-46f4-8120-1e377849955c nodeName:}" failed. No retries permitted until 2026-03-11 19:23:17.668720015 +0000 UTC m=+2043.316416305 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data") pod "nova-kuttl-cell1-compute-fake1-compute-0" (UID: "257b074b-1b06-46f4-8120-1e377849955c") : secret "nova-kuttl-cell1-compute-fake1-compute-config-data" not found
Mar 11 19:23:01 crc kubenswrapper[4842]: I0311 19:23:01.698031 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-cell1-server-0_e12d431f-86df-44d1-9877-3eb3c698d089/rabbitmq/0.log"
Mar 11 19:23:02 crc kubenswrapper[4842]: I0311 19:23:02.138000 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-notifications-server-0_8101bb7b-9fb5-418b-b490-e465171babc5/rabbitmq/0.log"
Mar 11 19:23:02 crc kubenswrapper[4842]: I0311 19:23:02.551032 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-server-0_13c13109-88f5-4c0d-9c15-739f9622af9d/rabbitmq/0.log"
Mar 11 19:23:04 crc kubenswrapper[4842]: E0311 19:23:04.999486 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:05 crc kubenswrapper[4842]: E0311 19:23:05.002262 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:05 crc kubenswrapper[4842]: E0311 19:23:05.003507 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:05 crc kubenswrapper[4842]: E0311 19:23:05.003546 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:10 crc kubenswrapper[4842]: E0311 19:23:10.003676 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:10 crc kubenswrapper[4842]: E0311 19:23:10.011187 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:10 crc kubenswrapper[4842]: E0311 19:23:10.016223 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:10 crc kubenswrapper[4842]: E0311 19:23:10.016323 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:15 crc kubenswrapper[4842]: E0311 19:23:15.000218 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:15 crc kubenswrapper[4842]: E0311 19:23:15.002406 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:15 crc kubenswrapper[4842]: E0311 19:23:15.004003 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-compute"]
Mar 11 19:23:15 crc kubenswrapper[4842]: E0311 19:23:15.004086 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.248279 4842 generic.go:334] "Generic (PLEG): container finished" podID="257b074b-1b06-46f4-8120-1e377849955c" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0" exitCode=137
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.248390 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerDied","Data":"fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0"}
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.249398 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0" event={"ID":"257b074b-1b06-46f4-8120-1e377849955c","Type":"ContainerDied","Data":"2475dffb3384e6aae3ed504b4d1993ae6d0ddf3500002e7725ffab137c554a04"}
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.249420 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2475dffb3384e6aae3ed504b4d1993ae6d0ddf3500002e7725ffab137c554a04"
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.249441 4842 scope.go:117] "RemoveContainer" containerID="b728604be5eb64805c476c48c2cbe396364e3e38990b7e6a9da6666c3dd8648b"
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.301559 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.444039 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9q64\" (UniqueName: \"kubernetes.io/projected/257b074b-1b06-46f4-8120-1e377849955c-kube-api-access-w9q64\") pod \"257b074b-1b06-46f4-8120-1e377849955c\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") "
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.444483 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data\") pod \"257b074b-1b06-46f4-8120-1e377849955c\" (UID: \"257b074b-1b06-46f4-8120-1e377849955c\") "
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.455542 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257b074b-1b06-46f4-8120-1e377849955c-kube-api-access-w9q64" (OuterVolumeSpecName: "kube-api-access-w9q64") pod "257b074b-1b06-46f4-8120-1e377849955c" (UID: "257b074b-1b06-46f4-8120-1e377849955c"). InnerVolumeSpecName "kube-api-access-w9q64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.468113 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data" (OuterVolumeSpecName: "config-data") pod "257b074b-1b06-46f4-8120-1e377849955c" (UID: "257b074b-1b06-46f4-8120-1e377849955c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.546392 4842 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/257b074b-1b06-46f4-8120-1e377849955c-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 19:23:17 crc kubenswrapper[4842]: I0311 19:23:17.546427 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9q64\" (UniqueName: \"kubernetes.io/projected/257b074b-1b06-46f4-8120-1e377849955c-kube-api-access-w9q64\") on node \"crc\" DevicePath \"\""
Mar 11 19:23:18 crc kubenswrapper[4842]: I0311 19:23:18.261593 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"
Mar 11 19:23:18 crc kubenswrapper[4842]: I0311 19:23:18.293055 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"]
Mar 11 19:23:18 crc kubenswrapper[4842]: I0311 19:23:18.299083 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["nova-kuttl-default/nova-kuttl-cell1-compute-fake1-compute-0"]
Mar 11 19:23:18 crc kubenswrapper[4842]: I0311 19:23:18.971534 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257b074b-1b06-46f4-8120-1e377849955c" path="/var/lib/kubelet/pods/257b074b-1b06-46f4-8120-1e377849955c/volumes"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.308547 4842 scope.go:117] "RemoveContainer" containerID="b9b07fbe672763ff024113057c1090a46efba5122d370ec0c70d7beb8256f443"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.347056 4842 scope.go:117] "RemoveContainer" containerID="f5c5e0d16de8e52cda1d25b45574f431d9af1c2ed4ac9c042a73865e25f653e9"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.367965 4842 scope.go:117] "RemoveContainer" containerID="b3c7f38c9a7a92bbe8480c9418530580b1bcd8873b264605f21486b39f5344e9"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.402067 4842 scope.go:117] "RemoveContainer" containerID="96858c3c26bffd17951c0120a7e0865a7b90b70ba13dccd00f694bb5067e9636"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.441145 4842 scope.go:117] "RemoveContainer" containerID="fb21b506ca3956bb6ad0d1b5272ad2bf7fd9b0e9c4b81b82340db9b3a1e3da95"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.459019 4842 scope.go:117] "RemoveContainer" containerID="4f9ff140f937d2755384ceb7ecb2094811c0487e008c06235ccfafaeffbda3d9"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.471956 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.472010 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.472061 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.472853 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa35ccc7978f645aca41cd60ffb442586f1c1d0afa2c03aac66ec6981fadd10b"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.473048 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://aa35ccc7978f645aca41cd60ffb442586f1c1d0afa2c03aac66ec6981fadd10b" gracePeriod=600
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.487440 4842 scope.go:117] "RemoveContainer" containerID="ba09ac168f1c10ac0a05210d2f655e06e8c6176dd3fcf7814e9683eb36d4c425"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.514414 4842 scope.go:117] "RemoveContainer" containerID="47619cbc0df4fb985600d1822015290e0b829210d1ab7c6623e688f54c1a4fb5"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.555801 4842 scope.go:117] "RemoveContainer" containerID="10a78b515298434e21790e36bedfcd2448cb9b6076f2a7cbdd850500bd417a84"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.573983 4842 scope.go:117] "RemoveContainer" containerID="93583afffb96aa0e89e642e781d2da3566fa59c9b2f599d8a928dc2f2f015c5e"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.591482 4842 scope.go:117] "RemoveContainer" containerID="39665699be4ef9ff883dcc65b468ba9074af72d2a74d759ea8562d58654fd331"
Mar 11 19:23:31 crc kubenswrapper[4842]: I0311 19:23:31.617140 4842 scope.go:117] "RemoveContainer" containerID="9a0648be5ad20533b3bc5df12e5ec9b79576f8a6335b3eedb2acbfe909d24253"
Mar 11 19:23:32 crc kubenswrapper[4842]: I0311 19:23:32.266660 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/extract/0.log"
Mar 11 19:23:32 crc kubenswrapper[4842]: I0311 19:23:32.394682 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="aa35ccc7978f645aca41cd60ffb442586f1c1d0afa2c03aac66ec6981fadd10b" exitCode=0
Mar 11 19:23:32 crc kubenswrapper[4842]: I0311 19:23:32.394741 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"aa35ccc7978f645aca41cd60ffb442586f1c1d0afa2c03aac66ec6981fadd10b"}
Mar 11 19:23:32 crc kubenswrapper[4842]: I0311 19:23:32.395207 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"}
Mar 11 19:23:32 crc kubenswrapper[4842]: I0311 19:23:32.395276 4842 scope.go:117] "RemoveContainer" containerID="a74b26f9155706c4ec6ac3e8f8776efdf91bef2b26bc68f08a1e3699bf335cd3"
Mar 11 19:23:32 crc kubenswrapper[4842]: I0311 19:23:32.654510 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/extract/0.log"
Mar 11 19:23:39 crc kubenswrapper[4842]: I0311 19:23:39.763806 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-n67cw_a59c06c7-f7ea-4d35-9053-2d969ec7e7f9/manager/0.log"
Mar 11 19:23:41 crc kubenswrapper[4842]: I0311 19:23:41.472713 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-sxlvs_cdb4f878-df19-48ad-bd71-88583edeb32a/manager/0.log"
Mar 11 19:23:41 crc kubenswrapper[4842]: I0311 19:23:41.871643 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-w9hp6_763d79b9-8982-4ef6-8bc7-c2378f8208f0/manager/0.log"
Mar 11 19:23:42 crc kubenswrapper[4842]: I0311 19:23:42.281804 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-m8dg9_de16110e-c77e-4513-b74b-86097ceb5a7d/manager/0.log"
Mar 11 19:23:42 crc kubenswrapper[4842]: I0311 19:23:42.671763 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-chrh5_6211c7b4-3c01-49bc-9f4e-59872605f5fe/manager/0.log"
Mar 11 19:23:43 crc kubenswrapper[4842]: I0311 19:23:43.066080 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-2vd9b_d87344b8-890b-4457-8f09-ec98bea8300e/manager/0.log"
Mar 11 19:23:43 crc kubenswrapper[4842]: I0311 19:23:43.643420 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-dkj58_80959ea3-dca7-4a95-b049-d8df7ebd0ce0/manager/0.log"
Mar 11 19:23:43 crc kubenswrapper[4842]: I0311 19:23:43.994746 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-677rf_024796ba-bf60-48db-962e-5d8bf962c127/manager/0.log"
Mar 11 19:23:44 crc kubenswrapper[4842]: I0311 19:23:44.501505 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-btk6h_9c018477-14f2-4729-949a-25a46eae03ef/manager/0.log"
Mar 11 19:23:44 crc kubenswrapper[4842]: I0311 19:23:44.979869 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-22vbs_bffda318-ec25-4b92-992b-50cf5fb2f6a5/manager/0.log"
Mar 11 19:23:45 crc kubenswrapper[4842]: I0311 19:23:45.421603 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-gbfbf_935542fd-daef-458a-b3fe-e2d8291d6c44/manager/0.log"
Mar 11 19:23:45 crc kubenswrapper[4842]: I0311 19:23:45.811833 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-qd5nx_ab4b8857-4909-4289-888e-711796d175d8/manager/0.log"
Mar 11 19:23:46 crc kubenswrapper[4842]: I0311 19:23:46.749505 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6f598d9474-l5k2t_c4b3af5a-7447-41c9-8cc0-5e927157aecf/manager/0.log"
Mar 11 19:23:47 crc kubenswrapper[4842]: I0311 19:23:47.154981 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-index-wln8t_e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46/registry-server/0.log"
Mar 11 19:23:47 crc kubenswrapper[4842]: I0311 19:23:47.533340 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-f9xmb_efd1a4f4-f73f-425c-87e9-a63681ca5466/manager/0.log"
Mar 11 19:23:47 crc kubenswrapper[4842]: I0311 19:23:47.904605 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b785bdc_463a4e68-9555-4065-aed2-91cdc5570602/manager/0.log"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.524992 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-btp4n"]
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528037 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd5d983-467b-4c01-aec7-079a61880193" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528075 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd5d983-467b-4c01-aec7-079a61880193" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528099 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7692542-b7f6-4720-a4d7-46839a09792d" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528108 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7692542-b7f6-4720-a4d7-46839a09792d" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528121 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-log"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528128 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-log"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528139 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-log"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528146 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-log"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528164 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f574839a-5168-492d-a22b-ecfbed63b274" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528171 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="f574839a-5168-492d-a22b-ecfbed63b274" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528186 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-api"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528195 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-api"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528206 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa949ad8-639a-4fc1-b4ae-b021fd3bd425" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528213 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa949ad8-639a-4fc1-b4ae-b021fd3bd425" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528228 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" containerName="nova-kuttl-cell1-conductor-conductor"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528234 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" containerName="nova-kuttl-cell1-conductor-conductor"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528244 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7229ac46-328f-479a-8ffa-3e28680ccabc" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528250 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="7229ac46-328f-479a-8ffa-3e28680ccabc" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528266 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528294 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528305 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528312 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528325 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e539e1f-8358-4179-8da7-edfd18eb6537" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528333 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e539e1f-8358-4179-8da7-edfd18eb6537" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528341 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528348 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: E0311 19:23:48.528361 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528368 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528615 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd5d983-467b-4c01-aec7-079a61880193" containerName="nova-kuttl-scheduler-scheduler"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528634 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e539e1f-8358-4179-8da7-edfd18eb6537" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528650 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="f574839a-5168-492d-a22b-ecfbed63b274" containerName="nova-kuttl-cell0-conductor-conductor"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528658 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7692542-b7f6-4720-a4d7-46839a09792d" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528668 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa949ad8-639a-4fc1-b4ae-b021fd3bd425" containerName="nova-kuttl-cell1-novncproxy-novncproxy"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528678 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-log"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528687 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.528696 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="7229ac46-328f-479a-8ffa-3e28680ccabc" containerName="mariadb-account-delete"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.530325 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.530358 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f5f0cf-a44b-4dc1-b37d-0fc48ae884a1" containerName="nova-kuttl-cell1-conductor-conductor"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.530372 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-metadata"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.530381 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="75caedc3-0ec9-4f3f-a381-8459fe9cad15" containerName="nova-kuttl-api-api"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.530392 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="d786c50e-6bfb-4b96-bec9-cfa618ab848a" containerName="nova-kuttl-metadata-log"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.530750 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="257b074b-1b06-46f4-8120-1e377849955c" containerName="nova-kuttl-cell1-compute-fake1-compute-compute"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.531701 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.559145 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btp4n"]
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.632350 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x89h\" (UniqueName: \"kubernetes.io/projected/93527329-8daf-4a4a-af38-61d95f32c31c-kube-api-access-2x89h\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.632571 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-catalog-content\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.632614 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-utilities\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.673216 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7547d775f4-htzsf_314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00/manager/0.log"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.734031 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-catalog-content\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.734086 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-utilities\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.734154 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x89h\" (UniqueName: \"kubernetes.io/projected/93527329-8daf-4a4a-af38-61d95f32c31c-kube-api-access-2x89h\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.734879 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-utilities\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.735022 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-catalog-content\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.755183 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x89h\" (UniqueName: \"kubernetes.io/projected/93527329-8daf-4a4a-af38-61d95f32c31c-kube-api-access-2x89h\") pod \"redhat-marketplace-btp4n\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:48 crc kubenswrapper[4842]: I0311 19:23:48.875124 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btp4n"
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.033493 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hvhrl_1b6f8f46-7c23-4380-b8e7-585c3e32ab04/registry-server/0.log"
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.309997 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btp4n"]
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.415603 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-m59rl_829f5c10-1f4f-4e84-a6ca-eba63ae106e2/manager/0.log"
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.557370 4842 generic.go:334] "Generic (PLEG): container finished" podID="93527329-8daf-4a4a-af38-61d95f32c31c" containerID="761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18" exitCode=0
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.557620 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btp4n" event={"ID":"93527329-8daf-4a4a-af38-61d95f32c31c","Type":"ContainerDied","Data":"761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18"}
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.557734 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btp4n" event={"ID":"93527329-8daf-4a4a-af38-61d95f32c31c","Type":"ContainerStarted","Data":"45142bfc248ffca46926fb63ad9800eb8d0192585f2991412b079814f7589586"}
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.559430 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 19:23:49 crc kubenswrapper[4842]: I0311 19:23:49.769892 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-xdc5p_54acfc0e-ae41-490e-ba38-f88a427ff791/manager/0.log"
Mar 11 19:23:50 crc kubenswrapper[4842]: I0311 19:23:50.141614 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7srdl_bfbdd09b-00b7-421c-911a-09e9720004f0/operator/0.log"
Mar 11 19:23:50 crc kubenswrapper[4842]: I0311 19:23:50.511910 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-h5ksp_56f77fcb-3101-446e-b070-ff1dcda13209/manager/0.log"
Mar 11 19:23:50 crc kubenswrapper[4842]: I0311 19:23:50.570479 4842 generic.go:334] "Generic (PLEG): container finished" podID="93527329-8daf-4a4a-af38-61d95f32c31c" containerID="afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42" exitCode=0
Mar 11 19:23:50 crc kubenswrapper[4842]: I0311 19:23:50.570548 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btp4n" event={"ID":"93527329-8daf-4a4a-af38-61d95f32c31c","Type":"ContainerDied","Data":"afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42"}
Mar 11 19:23:50 crc
kubenswrapper[4842]: I0311 19:23:50.940609 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-czq5g_868a1fe1-c01f-4a07-b8d5-2d02985cc29d/manager/0.log" Mar 11 19:23:51 crc kubenswrapper[4842]: I0311 19:23:51.352402 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-25rbp_0b197ad5-cb5c-483b-85c9-16578c56dd04/manager/0.log" Mar 11 19:23:51 crc kubenswrapper[4842]: I0311 19:23:51.580788 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btp4n" event={"ID":"93527329-8daf-4a4a-af38-61d95f32c31c","Type":"ContainerStarted","Data":"2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc"} Mar 11 19:23:51 crc kubenswrapper[4842]: I0311 19:23:51.599334 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-btp4n" podStartSLOduration=2.171505678 podStartE2EDuration="3.599312441s" podCreationTimestamp="2026-03-11 19:23:48 +0000 UTC" firstStartedPulling="2026-03-11 19:23:49.559167579 +0000 UTC m=+2075.206863859" lastFinishedPulling="2026-03-11 19:23:50.986974352 +0000 UTC m=+2076.634670622" observedRunningTime="2026-03-11 19:23:51.597141273 +0000 UTC m=+2077.244837553" watchObservedRunningTime="2026-03-11 19:23:51.599312441 +0000 UTC m=+2077.247008721" Mar 11 19:23:51 crc kubenswrapper[4842]: I0311 19:23:51.816117 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-hlvdg_a3e2f9c3-1a9b-441e-87ac-07e25d805293/manager/0.log" Mar 11 19:23:56 crc kubenswrapper[4842]: I0311 19:23:56.604720 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_keystone-64f57b6d8c-cz78k_4291a0cb-5c38-424b-bc49-301aab1e1f1a/keystone-api/0.log" Mar 11 19:23:58 crc kubenswrapper[4842]: I0311 19:23:58.875227 4842 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-btp4n" Mar 11 19:23:58 crc kubenswrapper[4842]: I0311 19:23:58.875488 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-btp4n" Mar 11 19:23:58 crc kubenswrapper[4842]: I0311 19:23:58.917803 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-btp4n" Mar 11 19:23:59 crc kubenswrapper[4842]: I0311 19:23:59.682098 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-btp4n" Mar 11 19:23:59 crc kubenswrapper[4842]: I0311 19:23:59.727966 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btp4n"] Mar 11 19:23:59 crc kubenswrapper[4842]: I0311 19:23:59.776357 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_memcached-0_6f57e7eb-fa53-4182-9531-a3ebcd1df17c/memcached/0.log" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.142209 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554284-9c588"] Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.143468 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554284-9c588" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.146768 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.146840 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.147132 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.153523 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554284-9c588"] Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.215736 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtftj\" (UniqueName: \"kubernetes.io/projected/6827f555-4985-4131-8bc6-2df2bc76ed73-kube-api-access-vtftj\") pod \"auto-csr-approver-29554284-9c588\" (UID: \"6827f555-4985-4131-8bc6-2df2bc76ed73\") " pod="openshift-infra/auto-csr-approver-29554284-9c588" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.255987 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-cell1-galera-0_0e137603-1bc4-4ccf-ba33-09993a8e6e79/galera/0.log" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.317815 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtftj\" (UniqueName: \"kubernetes.io/projected/6827f555-4985-4131-8bc6-2df2bc76ed73-kube-api-access-vtftj\") pod \"auto-csr-approver-29554284-9c588\" (UID: \"6827f555-4985-4131-8bc6-2df2bc76ed73\") " pod="openshift-infra/auto-csr-approver-29554284-9c588" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.339239 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vtftj\" (UniqueName: \"kubernetes.io/projected/6827f555-4985-4131-8bc6-2df2bc76ed73-kube-api-access-vtftj\") pod \"auto-csr-approver-29554284-9c588\" (UID: \"6827f555-4985-4131-8bc6-2df2bc76ed73\") " pod="openshift-infra/auto-csr-approver-29554284-9c588" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.496201 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554284-9c588" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.806026 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-galera-0_2b22b349-fc5f-4da6-818f-412f7dde5f00/galera/0.log" Mar 11 19:24:00 crc kubenswrapper[4842]: I0311 19:24:00.915543 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554284-9c588"] Mar 11 19:24:01 crc kubenswrapper[4842]: I0311 19:24:01.301885 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstackclient_b7dcae57-2024-4bfa-b657-f16d16bfd6c7/openstackclient/0.log" Mar 11 19:24:01 crc kubenswrapper[4842]: I0311 19:24:01.663469 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554284-9c588" event={"ID":"6827f555-4985-4131-8bc6-2df2bc76ed73","Type":"ContainerStarted","Data":"04761fa976fb6bed536e5b161cc8d1dc2b141c2f9ad3947b7afc81e1351e05e7"} Mar 11 19:24:01 crc kubenswrapper[4842]: I0311 19:24:01.663642 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-btp4n" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="registry-server" containerID="cri-o://2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc" gracePeriod=2 Mar 11 19:24:01 crc kubenswrapper[4842]: I0311 19:24:01.853886 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/nova-kuttl-default_placement-79b56db87d-ltvb2_35af45e3-739f-4769-a843-c951ad001e2e/placement-log/0.log" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.080775 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btp4n" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.253897 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-utilities\") pod \"93527329-8daf-4a4a-af38-61d95f32c31c\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.254026 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-catalog-content\") pod \"93527329-8daf-4a4a-af38-61d95f32c31c\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.254157 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x89h\" (UniqueName: \"kubernetes.io/projected/93527329-8daf-4a4a-af38-61d95f32c31c-kube-api-access-2x89h\") pod \"93527329-8daf-4a4a-af38-61d95f32c31c\" (UID: \"93527329-8daf-4a4a-af38-61d95f32c31c\") " Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.255322 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-utilities" (OuterVolumeSpecName: "utilities") pod "93527329-8daf-4a4a-af38-61d95f32c31c" (UID: "93527329-8daf-4a4a-af38-61d95f32c31c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.260728 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93527329-8daf-4a4a-af38-61d95f32c31c-kube-api-access-2x89h" (OuterVolumeSpecName: "kube-api-access-2x89h") pod "93527329-8daf-4a4a-af38-61d95f32c31c" (UID: "93527329-8daf-4a4a-af38-61d95f32c31c"). InnerVolumeSpecName "kube-api-access-2x89h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.282783 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93527329-8daf-4a4a-af38-61d95f32c31c" (UID: "93527329-8daf-4a4a-af38-61d95f32c31c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.356248 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.356301 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x89h\" (UniqueName: \"kubernetes.io/projected/93527329-8daf-4a4a-af38-61d95f32c31c-kube-api-access-2x89h\") on node \"crc\" DevicePath \"\"" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.356314 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93527329-8daf-4a4a-af38-61d95f32c31c-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.437261 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/nova-kuttl-default_rabbitmq-broadcaster-server-0_baa6ffd5-2b78-4119-b6f1-a70465d5288d/rabbitmq/0.log" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.674560 4842 generic.go:334] "Generic (PLEG): container finished" podID="93527329-8daf-4a4a-af38-61d95f32c31c" containerID="2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc" exitCode=0 Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.674652 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btp4n" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.674661 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btp4n" event={"ID":"93527329-8daf-4a4a-af38-61d95f32c31c","Type":"ContainerDied","Data":"2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc"} Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.675099 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btp4n" event={"ID":"93527329-8daf-4a4a-af38-61d95f32c31c","Type":"ContainerDied","Data":"45142bfc248ffca46926fb63ad9800eb8d0192585f2991412b079814f7589586"} Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.675141 4842 scope.go:117] "RemoveContainer" containerID="2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.677327 4842 generic.go:334] "Generic (PLEG): container finished" podID="6827f555-4985-4131-8bc6-2df2bc76ed73" containerID="7c0a3d1e62e5ae48da938896416b422a3b9be23d54e2edae443225cfc43d93af" exitCode=0 Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.677363 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554284-9c588" event={"ID":"6827f555-4985-4131-8bc6-2df2bc76ed73","Type":"ContainerDied","Data":"7c0a3d1e62e5ae48da938896416b422a3b9be23d54e2edae443225cfc43d93af"} Mar 11 19:24:02 crc 
kubenswrapper[4842]: I0311 19:24:02.744359 4842 scope.go:117] "RemoveContainer" containerID="afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.745684 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btp4n"] Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.753901 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-btp4n"] Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.764348 4842 scope.go:117] "RemoveContainer" containerID="761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.786896 4842 scope.go:117] "RemoveContainer" containerID="2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc" Mar 11 19:24:02 crc kubenswrapper[4842]: E0311 19:24:02.787483 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc\": container with ID starting with 2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc not found: ID does not exist" containerID="2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.787514 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc"} err="failed to get container status \"2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc\": rpc error: code = NotFound desc = could not find container \"2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc\": container with ID starting with 2a454d188feec745be38d60ada47640846a5edb6551c22feed2a597ab6bf32cc not found: ID does not exist" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.787534 4842 
scope.go:117] "RemoveContainer" containerID="afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42" Mar 11 19:24:02 crc kubenswrapper[4842]: E0311 19:24:02.788025 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42\": container with ID starting with afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42 not found: ID does not exist" containerID="afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.788054 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42"} err="failed to get container status \"afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42\": rpc error: code = NotFound desc = could not find container \"afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42\": container with ID starting with afab4234082afe111e49bc244aa4cccbc1def575cb6c04b5140cea74e4357b42 not found: ID does not exist" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.788073 4842 scope.go:117] "RemoveContainer" containerID="761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18" Mar 11 19:24:02 crc kubenswrapper[4842]: E0311 19:24:02.788395 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18\": container with ID starting with 761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18 not found: ID does not exist" containerID="761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.788442 4842 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18"} err="failed to get container status \"761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18\": rpc error: code = NotFound desc = could not find container \"761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18\": container with ID starting with 761424829f50bf1cda061a6e802be608583da72d6acdf3777cc191d8a57e4f18 not found: ID does not exist" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.942392 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-cell1-server-0_e12d431f-86df-44d1-9877-3eb3c698d089/rabbitmq/0.log" Mar 11 19:24:02 crc kubenswrapper[4842]: I0311 19:24:02.977221 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" path="/var/lib/kubelet/pods/93527329-8daf-4a4a-af38-61d95f32c31c/volumes" Mar 11 19:24:03 crc kubenswrapper[4842]: I0311 19:24:03.435662 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-notifications-server-0_8101bb7b-9fb5-418b-b490-e465171babc5/rabbitmq/0.log" Mar 11 19:24:03 crc kubenswrapper[4842]: I0311 19:24:03.934452 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-server-0_13c13109-88f5-4c0d-9c15-739f9622af9d/rabbitmq/0.log" Mar 11 19:24:04 crc kubenswrapper[4842]: I0311 19:24:04.042690 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554284-9c588" Mar 11 19:24:04 crc kubenswrapper[4842]: I0311 19:24:04.186915 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtftj\" (UniqueName: \"kubernetes.io/projected/6827f555-4985-4131-8bc6-2df2bc76ed73-kube-api-access-vtftj\") pod \"6827f555-4985-4131-8bc6-2df2bc76ed73\" (UID: \"6827f555-4985-4131-8bc6-2df2bc76ed73\") " Mar 11 19:24:04 crc kubenswrapper[4842]: I0311 19:24:04.192376 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6827f555-4985-4131-8bc6-2df2bc76ed73-kube-api-access-vtftj" (OuterVolumeSpecName: "kube-api-access-vtftj") pod "6827f555-4985-4131-8bc6-2df2bc76ed73" (UID: "6827f555-4985-4131-8bc6-2df2bc76ed73"). InnerVolumeSpecName "kube-api-access-vtftj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:24:04 crc kubenswrapper[4842]: I0311 19:24:04.289430 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtftj\" (UniqueName: \"kubernetes.io/projected/6827f555-4985-4131-8bc6-2df2bc76ed73-kube-api-access-vtftj\") on node \"crc\" DevicePath \"\"" Mar 11 19:24:04 crc kubenswrapper[4842]: I0311 19:24:04.696584 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554284-9c588" event={"ID":"6827f555-4985-4131-8bc6-2df2bc76ed73","Type":"ContainerDied","Data":"04761fa976fb6bed536e5b161cc8d1dc2b141c2f9ad3947b7afc81e1351e05e7"} Mar 11 19:24:04 crc kubenswrapper[4842]: I0311 19:24:04.696630 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554284-9c588" Mar 11 19:24:04 crc kubenswrapper[4842]: I0311 19:24:04.696641 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04761fa976fb6bed536e5b161cc8d1dc2b141c2f9ad3947b7afc81e1351e05e7" Mar 11 19:24:05 crc kubenswrapper[4842]: I0311 19:24:05.107760 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554278-hgntf"] Mar 11 19:24:05 crc kubenswrapper[4842]: I0311 19:24:05.114856 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554278-hgntf"] Mar 11 19:24:06 crc kubenswrapper[4842]: I0311 19:24:06.986451 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="636db3de-b9c6-43f3-8897-39ce019bd74e" path="/var/lib/kubelet/pods/636db3de-b9c6-43f3-8897-39ce019bd74e/volumes" Mar 11 19:24:31 crc kubenswrapper[4842]: I0311 19:24:31.909657 4842 scope.go:117] "RemoveContainer" containerID="209c76ad00e8050ef56d0498c117c3619a868a3e2238905d7977851871e32d90" Mar 11 19:24:31 crc kubenswrapper[4842]: I0311 19:24:31.964472 4842 scope.go:117] "RemoveContainer" containerID="92401beb2ec2ceb19a76e81eebd663897e78135b94a6090782898b4dd81c5018" Mar 11 19:24:34 crc kubenswrapper[4842]: I0311 19:24:34.907367 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/extract/0.log" Mar 11 19:24:35 crc kubenswrapper[4842]: I0311 19:24:35.301294 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/extract/0.log" Mar 11 19:24:42 crc kubenswrapper[4842]: I0311 19:24:42.018822 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-n67cw_a59c06c7-f7ea-4d35-9053-2d969ec7e7f9/manager/0.log" Mar 11 19:24:43 crc kubenswrapper[4842]: I0311 19:24:43.559752 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-sxlvs_cdb4f878-df19-48ad-bd71-88583edeb32a/manager/0.log" Mar 11 19:24:43 crc kubenswrapper[4842]: I0311 19:24:43.949670 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-w9hp6_763d79b9-8982-4ef6-8bc7-c2378f8208f0/manager/0.log" Mar 11 19:24:44 crc kubenswrapper[4842]: I0311 19:24:44.358050 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-m8dg9_de16110e-c77e-4513-b74b-86097ceb5a7d/manager/0.log" Mar 11 19:24:44 crc kubenswrapper[4842]: I0311 19:24:44.784686 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-chrh5_6211c7b4-3c01-49bc-9f4e-59872605f5fe/manager/0.log" Mar 11 19:24:45 crc kubenswrapper[4842]: I0311 19:24:45.184101 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-2vd9b_d87344b8-890b-4457-8f09-ec98bea8300e/manager/0.log" Mar 11 19:24:45 crc kubenswrapper[4842]: I0311 19:24:45.730472 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-dkj58_80959ea3-dca7-4a95-b049-d8df7ebd0ce0/manager/0.log" Mar 11 19:24:46 crc kubenswrapper[4842]: I0311 19:24:46.109890 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-677rf_024796ba-bf60-48db-962e-5d8bf962c127/manager/0.log" Mar 11 19:24:46 crc kubenswrapper[4842]: I0311 19:24:46.546515 4842 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-btk6h_9c018477-14f2-4729-949a-25a46eae03ef/manager/0.log" Mar 11 19:24:46 crc kubenswrapper[4842]: I0311 19:24:46.904963 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-22vbs_bffda318-ec25-4b92-992b-50cf5fb2f6a5/manager/0.log" Mar 11 19:24:47 crc kubenswrapper[4842]: I0311 19:24:47.350632 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-gbfbf_935542fd-daef-458a-b3fe-e2d8291d6c44/manager/0.log" Mar 11 19:24:47 crc kubenswrapper[4842]: I0311 19:24:47.718605 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-qd5nx_ab4b8857-4909-4289-888e-711796d175d8/manager/0.log" Mar 11 19:24:48 crc kubenswrapper[4842]: I0311 19:24:48.561522 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6f598d9474-l5k2t_c4b3af5a-7447-41c9-8cc0-5e927157aecf/manager/0.log" Mar 11 19:24:48 crc kubenswrapper[4842]: I0311 19:24:48.933522 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-index-wln8t_e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46/registry-server/0.log" Mar 11 19:24:49 crc kubenswrapper[4842]: I0311 19:24:49.289185 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-f9xmb_efd1a4f4-f73f-425c-87e9-a63681ca5466/manager/0.log" Mar 11 19:24:49 crc kubenswrapper[4842]: I0311 19:24:49.663416 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b785bdc_463a4e68-9555-4065-aed2-91cdc5570602/manager/0.log" Mar 11 19:24:50 crc kubenswrapper[4842]: I0311 19:24:50.336877 4842 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7547d775f4-htzsf_314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00/manager/0.log"
Mar 11 19:24:50 crc kubenswrapper[4842]: I0311 19:24:50.740969 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hvhrl_1b6f8f46-7c23-4380-b8e7-585c3e32ab04/registry-server/0.log"
Mar 11 19:24:51 crc kubenswrapper[4842]: I0311 19:24:51.141126 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-m59rl_829f5c10-1f4f-4e84-a6ca-eba63ae106e2/manager/0.log"
Mar 11 19:24:51 crc kubenswrapper[4842]: I0311 19:24:51.563575 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-xdc5p_54acfc0e-ae41-490e-ba38-f88a427ff791/manager/0.log"
Mar 11 19:24:51 crc kubenswrapper[4842]: I0311 19:24:51.954912 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7srdl_bfbdd09b-00b7-421c-911a-09e9720004f0/operator/0.log"
Mar 11 19:24:52 crc kubenswrapper[4842]: I0311 19:24:52.387018 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-h5ksp_56f77fcb-3101-446e-b070-ff1dcda13209/manager/0.log"
Mar 11 19:24:52 crc kubenswrapper[4842]: I0311 19:24:52.763731 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-czq5g_868a1fe1-c01f-4a07-b8d5-2d02985cc29d/manager/0.log"
Mar 11 19:24:53 crc kubenswrapper[4842]: I0311 19:24:53.160690 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-25rbp_0b197ad5-cb5c-483b-85c9-16578c56dd04/manager/0.log"
Mar 11 19:24:53 crc kubenswrapper[4842]: I0311 19:24:53.536386 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-hlvdg_a3e2f9c3-1a9b-441e-87ac-07e25d805293/manager/0.log"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.226396 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tk655"]
Mar 11 19:25:07 crc kubenswrapper[4842]: E0311 19:25:07.227287 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="registry-server"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.227305 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="registry-server"
Mar 11 19:25:07 crc kubenswrapper[4842]: E0311 19:25:07.227330 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6827f555-4985-4131-8bc6-2df2bc76ed73" containerName="oc"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.227337 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="6827f555-4985-4131-8bc6-2df2bc76ed73" containerName="oc"
Mar 11 19:25:07 crc kubenswrapper[4842]: E0311 19:25:07.227354 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="extract-utilities"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.227361 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="extract-utilities"
Mar 11 19:25:07 crc kubenswrapper[4842]: E0311 19:25:07.227377 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="extract-content"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.227382 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="extract-content"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.227529 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="6827f555-4985-4131-8bc6-2df2bc76ed73" containerName="oc"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.227547 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="93527329-8daf-4a4a-af38-61d95f32c31c" containerName="registry-server"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.228724 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.244773 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tk655"]
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.330310 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-catalog-content\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.330368 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbbb9\" (UniqueName: \"kubernetes.io/projected/98e71559-4c7f-4d2d-8e4f-539b836f8af2-kube-api-access-kbbb9\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.330427 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-utilities\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.432085 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-utilities\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.432208 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-catalog-content\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.432239 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbb9\" (UniqueName: \"kubernetes.io/projected/98e71559-4c7f-4d2d-8e4f-539b836f8af2-kube-api-access-kbbb9\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.432821 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-catalog-content\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.432862 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-utilities\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.459412 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbbb9\" (UniqueName: \"kubernetes.io/projected/98e71559-4c7f-4d2d-8e4f-539b836f8af2-kube-api-access-kbbb9\") pod \"redhat-operators-tk655\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") " pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:07 crc kubenswrapper[4842]: I0311 19:25:07.550741 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:08 crc kubenswrapper[4842]: I0311 19:25:08.004407 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tk655"]
Mar 11 19:25:08 crc kubenswrapper[4842]: I0311 19:25:08.265863 4842 generic.go:334] "Generic (PLEG): container finished" podID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerID="70cbecfc7f4d5861aac503fbfee829e815cf2501793d0683dd8cbb93bd5942ed" exitCode=0
Mar 11 19:25:08 crc kubenswrapper[4842]: I0311 19:25:08.265923 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk655" event={"ID":"98e71559-4c7f-4d2d-8e4f-539b836f8af2","Type":"ContainerDied","Data":"70cbecfc7f4d5861aac503fbfee829e815cf2501793d0683dd8cbb93bd5942ed"}
Mar 11 19:25:08 crc kubenswrapper[4842]: I0311 19:25:08.266242 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk655" event={"ID":"98e71559-4c7f-4d2d-8e4f-539b836f8af2","Type":"ContainerStarted","Data":"734c0b1c8dcafe93e09b78e5fcff20ee99e8741a19d98862ae14fabec4595fe2"}
Mar 11 19:25:10 crc kubenswrapper[4842]: E0311 19:25:10.230911 4842 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98e71559_4c7f_4d2d_8e4f_539b836f8af2.slice/crio-conmon-e56e907aa9926fc6419f321caaf4f19fd3624dfe5631f8ca51eb778748fc9597.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98e71559_4c7f_4d2d_8e4f_539b836f8af2.slice/crio-e56e907aa9926fc6419f321caaf4f19fd3624dfe5631f8ca51eb778748fc9597.scope\": RecentStats: unable to find data in memory cache]"
Mar 11 19:25:10 crc kubenswrapper[4842]: I0311 19:25:10.286983 4842 generic.go:334] "Generic (PLEG): container finished" podID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerID="e56e907aa9926fc6419f321caaf4f19fd3624dfe5631f8ca51eb778748fc9597" exitCode=0
Mar 11 19:25:10 crc kubenswrapper[4842]: I0311 19:25:10.287044 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk655" event={"ID":"98e71559-4c7f-4d2d-8e4f-539b836f8af2","Type":"ContainerDied","Data":"e56e907aa9926fc6419f321caaf4f19fd3624dfe5631f8ca51eb778748fc9597"}
Mar 11 19:25:11 crc kubenswrapper[4842]: I0311 19:25:11.296476 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk655" event={"ID":"98e71559-4c7f-4d2d-8e4f-539b836f8af2","Type":"ContainerStarted","Data":"f5be037c602ce34e971d85a81e40521f3cb228463772de97f6fe712ef7dfe571"}
Mar 11 19:25:11 crc kubenswrapper[4842]: I0311 19:25:11.320138 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tk655" podStartSLOduration=1.714818587 podStartE2EDuration="4.32011972s" podCreationTimestamp="2026-03-11 19:25:07 +0000 UTC" firstStartedPulling="2026-03-11 19:25:08.267564549 +0000 UTC m=+2153.915260829" lastFinishedPulling="2026-03-11 19:25:10.872865682 +0000 UTC m=+2156.520561962" observedRunningTime="2026-03-11 19:25:11.314001047 +0000 UTC m=+2156.961697327" watchObservedRunningTime="2026-03-11 19:25:11.32011972 +0000 UTC m=+2156.967816000"
Mar 11 19:25:15 crc kubenswrapper[4842]: I0311 19:25:15.947596 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lxdz8/must-gather-pmx82"]
Mar 11 19:25:15 crc kubenswrapper[4842]: I0311 19:25:15.951494 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:15 crc kubenswrapper[4842]: I0311 19:25:15.965991 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lxdz8"/"default-dockercfg-wb87t"
Mar 11 19:25:15 crc kubenswrapper[4842]: I0311 19:25:15.966448 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lxdz8"/"openshift-service-ca.crt"
Mar 11 19:25:15 crc kubenswrapper[4842]: I0311 19:25:15.966739 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lxdz8"/"kube-root-ca.crt"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.000379 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgfzh\" (UniqueName: \"kubernetes.io/projected/dfb41914-e773-4d85-9875-580c71cd1414-kube-api-access-pgfzh\") pod \"must-gather-pmx82\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") " pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.000716 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb41914-e773-4d85-9875-580c71cd1414-must-gather-output\") pod \"must-gather-pmx82\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") " pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.014521 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lxdz8/must-gather-pmx82"]
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.103812 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgfzh\" (UniqueName: \"kubernetes.io/projected/dfb41914-e773-4d85-9875-580c71cd1414-kube-api-access-pgfzh\") pod \"must-gather-pmx82\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") " pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.103860 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb41914-e773-4d85-9875-580c71cd1414-must-gather-output\") pod \"must-gather-pmx82\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") " pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.104270 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb41914-e773-4d85-9875-580c71cd1414-must-gather-output\") pod \"must-gather-pmx82\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") " pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.141978 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgfzh\" (UniqueName: \"kubernetes.io/projected/dfb41914-e773-4d85-9875-580c71cd1414-kube-api-access-pgfzh\") pod \"must-gather-pmx82\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") " pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.306619 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:25:16 crc kubenswrapper[4842]: I0311 19:25:16.795606 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lxdz8/must-gather-pmx82"]
Mar 11 19:25:17 crc kubenswrapper[4842]: I0311 19:25:17.499585 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lxdz8/must-gather-pmx82" event={"ID":"dfb41914-e773-4d85-9875-580c71cd1414","Type":"ContainerStarted","Data":"712d888e67914a586284c4406eed7ad4d36548b236228adfd25a6b966e86cb38"}
Mar 11 19:25:17 crc kubenswrapper[4842]: I0311 19:25:17.552301 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:17 crc kubenswrapper[4842]: I0311 19:25:17.552365 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:17 crc kubenswrapper[4842]: I0311 19:25:17.595967 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:18 crc kubenswrapper[4842]: I0311 19:25:18.568230 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:18 crc kubenswrapper[4842]: I0311 19:25:18.619241 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tk655"]
Mar 11 19:25:20 crc kubenswrapper[4842]: I0311 19:25:20.525202 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tk655" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="registry-server" containerID="cri-o://f5be037c602ce34e971d85a81e40521f3cb228463772de97f6fe712ef7dfe571" gracePeriod=2
Mar 11 19:25:21 crc kubenswrapper[4842]: I0311 19:25:21.536579 4842 generic.go:334] "Generic (PLEG): container finished" podID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerID="f5be037c602ce34e971d85a81e40521f3cb228463772de97f6fe712ef7dfe571" exitCode=0
Mar 11 19:25:21 crc kubenswrapper[4842]: I0311 19:25:21.536784 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk655" event={"ID":"98e71559-4c7f-4d2d-8e4f-539b836f8af2","Type":"ContainerDied","Data":"f5be037c602ce34e971d85a81e40521f3cb228463772de97f6fe712ef7dfe571"}
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.670020 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.755938 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-catalog-content\") pod \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") "
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.756291 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-utilities\") pod \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") "
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.756368 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbbb9\" (UniqueName: \"kubernetes.io/projected/98e71559-4c7f-4d2d-8e4f-539b836f8af2-kube-api-access-kbbb9\") pod \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\" (UID: \"98e71559-4c7f-4d2d-8e4f-539b836f8af2\") "
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.757259 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-utilities" (OuterVolumeSpecName: "utilities") pod "98e71559-4c7f-4d2d-8e4f-539b836f8af2" (UID: "98e71559-4c7f-4d2d-8e4f-539b836f8af2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.761651 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e71559-4c7f-4d2d-8e4f-539b836f8af2-kube-api-access-kbbb9" (OuterVolumeSpecName: "kube-api-access-kbbb9") pod "98e71559-4c7f-4d2d-8e4f-539b836f8af2" (UID: "98e71559-4c7f-4d2d-8e4f-539b836f8af2"). InnerVolumeSpecName "kube-api-access-kbbb9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.858078 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.858118 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbbb9\" (UniqueName: \"kubernetes.io/projected/98e71559-4c7f-4d2d-8e4f-539b836f8af2-kube-api-access-kbbb9\") on node \"crc\" DevicePath \"\""
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.886592 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98e71559-4c7f-4d2d-8e4f-539b836f8af2" (UID: "98e71559-4c7f-4d2d-8e4f-539b836f8af2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:25:23 crc kubenswrapper[4842]: I0311 19:25:23.959464 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98e71559-4c7f-4d2d-8e4f-539b836f8af2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.566197 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lxdz8/must-gather-pmx82" event={"ID":"dfb41914-e773-4d85-9875-580c71cd1414","Type":"ContainerStarted","Data":"24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f"}
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.566251 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lxdz8/must-gather-pmx82" event={"ID":"dfb41914-e773-4d85-9875-580c71cd1414","Type":"ContainerStarted","Data":"dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2"}
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.572069 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tk655" event={"ID":"98e71559-4c7f-4d2d-8e4f-539b836f8af2","Type":"ContainerDied","Data":"734c0b1c8dcafe93e09b78e5fcff20ee99e8741a19d98862ae14fabec4595fe2"}
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.572134 4842 scope.go:117] "RemoveContainer" containerID="f5be037c602ce34e971d85a81e40521f3cb228463772de97f6fe712ef7dfe571"
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.572170 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tk655"
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.584541 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lxdz8/must-gather-pmx82" podStartSLOduration=2.9622771329999997 podStartE2EDuration="9.584512768s" podCreationTimestamp="2026-03-11 19:25:15 +0000 UTC" firstStartedPulling="2026-03-11 19:25:16.799320199 +0000 UTC m=+2162.447016479" lastFinishedPulling="2026-03-11 19:25:23.421555834 +0000 UTC m=+2169.069252114" observedRunningTime="2026-03-11 19:25:24.580563699 +0000 UTC m=+2170.228259999" watchObservedRunningTime="2026-03-11 19:25:24.584512768 +0000 UTC m=+2170.232209088"
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.600174 4842 scope.go:117] "RemoveContainer" containerID="e56e907aa9926fc6419f321caaf4f19fd3624dfe5631f8ca51eb778748fc9597"
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.618500 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tk655"]
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.627484 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tk655"]
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.637400 4842 scope.go:117] "RemoveContainer" containerID="70cbecfc7f4d5861aac503fbfee829e815cf2501793d0683dd8cbb93bd5942ed"
Mar 11 19:25:24 crc kubenswrapper[4842]: I0311 19:25:24.973323 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" path="/var/lib/kubelet/pods/98e71559-4c7f-4d2d-8e4f-539b836f8af2/volumes"
Mar 11 19:25:31 crc kubenswrapper[4842]: I0311 19:25:31.472244 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:25:31 crc kubenswrapper[4842]: I0311 19:25:31.473095 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.109219 4842 scope.go:117] "RemoveContainer" containerID="d3968f4d3635345512a39a183159f9905cf7eb460774c862344b832cffb1b77f"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.131693 4842 scope.go:117] "RemoveContainer" containerID="88d564381d7f26e27d1b3d0a05c9b10613f2d4f1e17139990fda10372a35824b"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.157278 4842 scope.go:117] "RemoveContainer" containerID="ff3c344cd0c2091ce202287f80e01277abde41e1a6484ded7ad6494ae63a4c87"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.214801 4842 scope.go:117] "RemoveContainer" containerID="95be4b65cec964953b8f7bbbc204b195c77a8b2878efc486527749b9f66bcac0"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.234769 4842 scope.go:117] "RemoveContainer" containerID="e21f8e210fa7b0832f3f7981d35cb864a41246531795e7de2350e86a8f09a448"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.271206 4842 scope.go:117] "RemoveContainer" containerID="419057b210958847d4c9e7479bb1c4fbc48301a95ddb46a8ad9cb915ee3442c1"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.294189 4842 scope.go:117] "RemoveContainer" containerID="fa440ff723cf8ad7315429b51c038cdc5747e1375e498ea0f7814f8eb17d80f0"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.316690 4842 scope.go:117] "RemoveContainer" containerID="7f782f0e862a0382c7570ae48b4caf788394118bb273b708be6d0b926d5e9820"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.336556 4842 scope.go:117] "RemoveContainer" containerID="2ba1fa8aecca2b976ca1500d29b35e961cc54ad785288f18ad3d9b7c269a093e"
Mar 11 19:25:32 crc kubenswrapper[4842]: I0311 19:25:32.358332 4842 scope.go:117] "RemoveContainer" containerID="775646a243bc97c0eb00108bf2c234d3c6d7cefa0f6f992ab050520c7f84b32f"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.719430 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbkcs"]
Mar 11 19:25:50 crc kubenswrapper[4842]: E0311 19:25:50.720370 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="registry-server"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.720386 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="registry-server"
Mar 11 19:25:50 crc kubenswrapper[4842]: E0311 19:25:50.720417 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="extract-content"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.720426 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="extract-content"
Mar 11 19:25:50 crc kubenswrapper[4842]: E0311 19:25:50.720450 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="extract-utilities"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.720458 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="extract-utilities"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.720650 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e71559-4c7f-4d2d-8e4f-539b836f8af2" containerName="registry-server"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.722154 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.747047 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbkcs"]
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.843066 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x946q\" (UniqueName: \"kubernetes.io/projected/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-kube-api-access-x946q\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.843204 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-catalog-content\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.843228 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-utilities\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.944806 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-catalog-content\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.944868 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-utilities\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.944921 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x946q\" (UniqueName: \"kubernetes.io/projected/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-kube-api-access-x946q\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.945489 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-catalog-content\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.946178 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-utilities\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:50 crc kubenswrapper[4842]: I0311 19:25:50.987072 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x946q\" (UniqueName: \"kubernetes.io/projected/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-kube-api-access-x946q\") pod \"community-operators-tbkcs\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:51 crc kubenswrapper[4842]: I0311 19:25:51.042112 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:25:51 crc kubenswrapper[4842]: I0311 19:25:51.576743 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbkcs"]
Mar 11 19:25:51 crc kubenswrapper[4842]: I0311 19:25:51.791774 4842 generic.go:334] "Generic (PLEG): container finished" podID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerID="05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4" exitCode=0
Mar 11 19:25:51 crc kubenswrapper[4842]: I0311 19:25:51.791819 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbkcs" event={"ID":"ec68ab7d-2db6-4cf4-bd32-10b1b030d369","Type":"ContainerDied","Data":"05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4"}
Mar 11 19:25:51 crc kubenswrapper[4842]: I0311 19:25:51.791848 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbkcs" event={"ID":"ec68ab7d-2db6-4cf4-bd32-10b1b030d369","Type":"ContainerStarted","Data":"eceba74a2608d67583a6bbd552e15493b3673f273b5b1002a411487cd2197529"}
Mar 11 19:25:52 crc kubenswrapper[4842]: I0311 19:25:52.815309 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbkcs" event={"ID":"ec68ab7d-2db6-4cf4-bd32-10b1b030d369","Type":"ContainerStarted","Data":"81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5"}
Mar 11 19:25:53 crc kubenswrapper[4842]: I0311 19:25:53.824837 4842 generic.go:334] "Generic (PLEG): container finished" podID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerID="81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5" exitCode=0
Mar 11 19:25:53 crc kubenswrapper[4842]: I0311 19:25:53.824892 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbkcs" event={"ID":"ec68ab7d-2db6-4cf4-bd32-10b1b030d369","Type":"ContainerDied","Data":"81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5"}
Mar 11 19:25:54 crc kubenswrapper[4842]: I0311 19:25:54.835115 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbkcs" event={"ID":"ec68ab7d-2db6-4cf4-bd32-10b1b030d369","Type":"ContainerStarted","Data":"0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae"}
Mar 11 19:25:54 crc kubenswrapper[4842]: I0311 19:25:54.855525 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbkcs" podStartSLOduration=2.332072159 podStartE2EDuration="4.855505867s" podCreationTimestamp="2026-03-11 19:25:50 +0000 UTC" firstStartedPulling="2026-03-11 19:25:51.793403646 +0000 UTC m=+2197.441099916" lastFinishedPulling="2026-03-11 19:25:54.316837344 +0000 UTC m=+2199.964533624" observedRunningTime="2026-03-11 19:25:54.851919447 +0000 UTC m=+2200.499615727" watchObservedRunningTime="2026-03-11 19:25:54.855505867 +0000 UTC m=+2200.503202147"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.143389 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554286-vhj56"]
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.145313 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554286-vhj56"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.148212 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.148550 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.148932 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.154806 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554286-vhj56"]
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.295305 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b989q\" (UniqueName: \"kubernetes.io/projected/69d5852b-6012-4a40-b7cd-c93401c6669f-kube-api-access-b989q\") pod \"auto-csr-approver-29554286-vhj56\" (UID: \"69d5852b-6012-4a40-b7cd-c93401c6669f\") " pod="openshift-infra/auto-csr-approver-29554286-vhj56"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.396919 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b989q\" (UniqueName: \"kubernetes.io/projected/69d5852b-6012-4a40-b7cd-c93401c6669f-kube-api-access-b989q\") pod \"auto-csr-approver-29554286-vhj56\" (UID: \"69d5852b-6012-4a40-b7cd-c93401c6669f\") " pod="openshift-infra/auto-csr-approver-29554286-vhj56"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.414234 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b989q\" (UniqueName: \"kubernetes.io/projected/69d5852b-6012-4a40-b7cd-c93401c6669f-kube-api-access-b989q\") pod \"auto-csr-approver-29554286-vhj56\" (UID: \"69d5852b-6012-4a40-b7cd-c93401c6669f\") " pod="openshift-infra/auto-csr-approver-29554286-vhj56"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.468057 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554286-vhj56"
Mar 11 19:26:00 crc kubenswrapper[4842]: I0311 19:26:00.920402 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554286-vhj56"]
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.042643 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.043614 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.084820 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.471389 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.472351 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.890304 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554286-vhj56" event={"ID":"69d5852b-6012-4a40-b7cd-c93401c6669f","Type":"ContainerStarted","Data":"e06578c012f638b5eccf9fec44fc6e30a3a6d535124f27b0b02a720257f801ef"}
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.929699 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbkcs"
Mar 11 19:26:01 crc kubenswrapper[4842]: I0311 19:26:01.971779 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbkcs"]
Mar 11 19:26:02 crc kubenswrapper[4842]: I0311 19:26:02.900965 4842 generic.go:334] "Generic (PLEG): container finished" podID="69d5852b-6012-4a40-b7cd-c93401c6669f" containerID="01102a2d073002746d7c2d8097cfb7b19eabafa131a7e9f260837903bdd606c5" exitCode=0
Mar 11 19:26:02 crc kubenswrapper[4842]: I0311 19:26:02.901073 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554286-vhj56" event={"ID":"69d5852b-6012-4a40-b7cd-c93401c6669f","Type":"ContainerDied","Data":"01102a2d073002746d7c2d8097cfb7b19eabafa131a7e9f260837903bdd606c5"}
Mar 11 19:26:03 crc kubenswrapper[4842]: I0311 19:26:03.910910 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tbkcs" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="registry-server" containerID="cri-o://0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae" gracePeriod=2
Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.247681 4842 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554286-vhj56" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.363360 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b989q\" (UniqueName: \"kubernetes.io/projected/69d5852b-6012-4a40-b7cd-c93401c6669f-kube-api-access-b989q\") pod \"69d5852b-6012-4a40-b7cd-c93401c6669f\" (UID: \"69d5852b-6012-4a40-b7cd-c93401c6669f\") " Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.369597 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d5852b-6012-4a40-b7cd-c93401c6669f-kube-api-access-b989q" (OuterVolumeSpecName: "kube-api-access-b989q") pod "69d5852b-6012-4a40-b7cd-c93401c6669f" (UID: "69d5852b-6012-4a40-b7cd-c93401c6669f"). InnerVolumeSpecName "kube-api-access-b989q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.452839 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tbkcs" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.466187 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b989q\" (UniqueName: \"kubernetes.io/projected/69d5852b-6012-4a40-b7cd-c93401c6669f-kube-api-access-b989q\") on node \"crc\" DevicePath \"\"" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.567675 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x946q\" (UniqueName: \"kubernetes.io/projected/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-kube-api-access-x946q\") pod \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.567817 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-catalog-content\") pod \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.567848 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-utilities\") pod \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\" (UID: \"ec68ab7d-2db6-4cf4-bd32-10b1b030d369\") " Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.568963 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-utilities" (OuterVolumeSpecName: "utilities") pod "ec68ab7d-2db6-4cf4-bd32-10b1b030d369" (UID: "ec68ab7d-2db6-4cf4-bd32-10b1b030d369"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.572369 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-kube-api-access-x946q" (OuterVolumeSpecName: "kube-api-access-x946q") pod "ec68ab7d-2db6-4cf4-bd32-10b1b030d369" (UID: "ec68ab7d-2db6-4cf4-bd32-10b1b030d369"). InnerVolumeSpecName "kube-api-access-x946q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.631259 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec68ab7d-2db6-4cf4-bd32-10b1b030d369" (UID: "ec68ab7d-2db6-4cf4-bd32-10b1b030d369"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.670128 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x946q\" (UniqueName: \"kubernetes.io/projected/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-kube-api-access-x946q\") on node \"crc\" DevicePath \"\"" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.670167 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.670181 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec68ab7d-2db6-4cf4-bd32-10b1b030d369-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.924540 4842 generic.go:334] "Generic (PLEG): container finished" podID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" 
containerID="0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae" exitCode=0 Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.924608 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbkcs" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.924610 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbkcs" event={"ID":"ec68ab7d-2db6-4cf4-bd32-10b1b030d369","Type":"ContainerDied","Data":"0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae"} Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.925664 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbkcs" event={"ID":"ec68ab7d-2db6-4cf4-bd32-10b1b030d369","Type":"ContainerDied","Data":"eceba74a2608d67583a6bbd552e15493b3673f273b5b1002a411487cd2197529"} Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.925690 4842 scope.go:117] "RemoveContainer" containerID="0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.927526 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554286-vhj56" event={"ID":"69d5852b-6012-4a40-b7cd-c93401c6669f","Type":"ContainerDied","Data":"e06578c012f638b5eccf9fec44fc6e30a3a6d535124f27b0b02a720257f801ef"} Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.927555 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e06578c012f638b5eccf9fec44fc6e30a3a6d535124f27b0b02a720257f801ef" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.927610 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554286-vhj56" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.947814 4842 scope.go:117] "RemoveContainer" containerID="81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5" Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.975535 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbkcs"] Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.975589 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tbkcs"] Mar 11 19:26:04 crc kubenswrapper[4842]: I0311 19:26:04.983700 4842 scope.go:117] "RemoveContainer" containerID="05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4" Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.003756 4842 scope.go:117] "RemoveContainer" containerID="0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae" Mar 11 19:26:05 crc kubenswrapper[4842]: E0311 19:26:05.004370 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae\": container with ID starting with 0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae not found: ID does not exist" containerID="0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae" Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.004402 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae"} err="failed to get container status \"0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae\": rpc error: code = NotFound desc = could not find container \"0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae\": container with ID starting with 0ae912cb17103f503d9a0bdc2d639ca4fd3ebf459dfcd73f586fc4d8ad2c26ae not 
found: ID does not exist" Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.004423 4842 scope.go:117] "RemoveContainer" containerID="81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5" Mar 11 19:26:05 crc kubenswrapper[4842]: E0311 19:26:05.004700 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5\": container with ID starting with 81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5 not found: ID does not exist" containerID="81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5" Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.004758 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5"} err="failed to get container status \"81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5\": rpc error: code = NotFound desc = could not find container \"81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5\": container with ID starting with 81a1fc1639f16025b37abc61716087021f90be94261969ca31be27fd3c9a3da5 not found: ID does not exist" Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.004794 4842 scope.go:117] "RemoveContainer" containerID="05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4" Mar 11 19:26:05 crc kubenswrapper[4842]: E0311 19:26:05.005208 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4\": container with ID starting with 05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4 not found: ID does not exist" containerID="05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4" Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.005234 4842 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4"} err="failed to get container status \"05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4\": rpc error: code = NotFound desc = could not find container \"05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4\": container with ID starting with 05333ef8e514e39b70977b09d0cef5cc2b95289a9f57c660148d23439bd529b4 not found: ID does not exist" Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.350291 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554280-75pf5"] Mar 11 19:26:05 crc kubenswrapper[4842]: I0311 19:26:05.357598 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554280-75pf5"] Mar 11 19:26:06 crc kubenswrapper[4842]: I0311 19:26:06.972254 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5d5da09-a866-4784-9c0d-914384019453" path="/var/lib/kubelet/pods/a5d5da09-a866-4784-9c0d-914384019453/volumes" Mar 11 19:26:06 crc kubenswrapper[4842]: I0311 19:26:06.972968 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" path="/var/lib/kubelet/pods/ec68ab7d-2db6-4cf4-bd32-10b1b030d369/volumes" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.191850 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qp62d"] Mar 11 19:26:17 crc kubenswrapper[4842]: E0311 19:26:17.192905 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="extract-content" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.192919 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="extract-content" Mar 11 19:26:17 crc kubenswrapper[4842]: E0311 
19:26:17.192932 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="extract-utilities" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.192938 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="extract-utilities" Mar 11 19:26:17 crc kubenswrapper[4842]: E0311 19:26:17.192948 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d5852b-6012-4a40-b7cd-c93401c6669f" containerName="oc" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.192954 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d5852b-6012-4a40-b7cd-c93401c6669f" containerName="oc" Mar 11 19:26:17 crc kubenswrapper[4842]: E0311 19:26:17.192968 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="registry-server" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.192975 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="registry-server" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.193140 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d5852b-6012-4a40-b7cd-c93401c6669f" containerName="oc" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.193169 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec68ab7d-2db6-4cf4-bd32-10b1b030d369" containerName="registry-server" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.194436 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.215463 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp62d"] Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.305069 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-utilities\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.305172 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-catalog-content\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.305219 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk54v\" (UniqueName: \"kubernetes.io/projected/ec03338a-6ad2-404c-85aa-f48c8398a7e4-kube-api-access-wk54v\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.407093 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-utilities\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.407464 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-catalog-content\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.407530 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk54v\" (UniqueName: \"kubernetes.io/projected/ec03338a-6ad2-404c-85aa-f48c8398a7e4-kube-api-access-wk54v\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.407618 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-utilities\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.408102 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-catalog-content\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.427356 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk54v\" (UniqueName: \"kubernetes.io/projected/ec03338a-6ad2-404c-85aa-f48c8398a7e4-kube-api-access-wk54v\") pod \"certified-operators-qp62d\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.513516 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:17 crc kubenswrapper[4842]: I0311 19:26:17.993153 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp62d"] Mar 11 19:26:18 crc kubenswrapper[4842]: I0311 19:26:18.071432 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp62d" event={"ID":"ec03338a-6ad2-404c-85aa-f48c8398a7e4","Type":"ContainerStarted","Data":"7b3e659f116e928044cd16445c032fc50d2d2d327285d30b9524739cf27de76d"} Mar 11 19:26:19 crc kubenswrapper[4842]: I0311 19:26:19.080677 4842 generic.go:334] "Generic (PLEG): container finished" podID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerID="b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc" exitCode=0 Mar 11 19:26:19 crc kubenswrapper[4842]: I0311 19:26:19.080769 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp62d" event={"ID":"ec03338a-6ad2-404c-85aa-f48c8398a7e4","Type":"ContainerDied","Data":"b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc"} Mar 11 19:26:20 crc kubenswrapper[4842]: I0311 19:26:20.101169 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp62d" event={"ID":"ec03338a-6ad2-404c-85aa-f48c8398a7e4","Type":"ContainerStarted","Data":"38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241"} Mar 11 19:26:21 crc kubenswrapper[4842]: I0311 19:26:21.111489 4842 generic.go:334] "Generic (PLEG): container finished" podID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerID="38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241" exitCode=0 Mar 11 19:26:21 crc kubenswrapper[4842]: I0311 19:26:21.111551 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp62d" 
event={"ID":"ec03338a-6ad2-404c-85aa-f48c8398a7e4","Type":"ContainerDied","Data":"38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241"} Mar 11 19:26:22 crc kubenswrapper[4842]: I0311 19:26:22.122590 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp62d" event={"ID":"ec03338a-6ad2-404c-85aa-f48c8398a7e4","Type":"ContainerStarted","Data":"153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f"} Mar 11 19:26:22 crc kubenswrapper[4842]: I0311 19:26:22.150111 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qp62d" podStartSLOduration=2.6415519830000003 podStartE2EDuration="5.150089287s" podCreationTimestamp="2026-03-11 19:26:17 +0000 UTC" firstStartedPulling="2026-03-11 19:26:19.082842127 +0000 UTC m=+2224.730538407" lastFinishedPulling="2026-03-11 19:26:21.591379431 +0000 UTC m=+2227.239075711" observedRunningTime="2026-03-11 19:26:22.140866805 +0000 UTC m=+2227.788563095" watchObservedRunningTime="2026-03-11 19:26:22.150089287 +0000 UTC m=+2227.797785567" Mar 11 19:26:23 crc kubenswrapper[4842]: I0311 19:26:23.976600 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/util/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.162191 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/pull/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.171914 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/pull/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.196209 4842 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/util/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.403027 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/util/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.416743 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/extract/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.423975 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6f27525c29bce2fcf22856044d1c0e3fc95fa4151a948be36e228281b7f5nf6_e60d545e-d480-44f7-8c67-bba9975dd402/pull/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.603458 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/util/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.802471 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/pull/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.816006 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/util/0.log" Mar 11 19:26:24 crc kubenswrapper[4842]: I0311 19:26:24.834634 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/pull/0.log" Mar 11 19:26:24 
crc kubenswrapper[4842]: I0311 19:26:24.959123 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/util/0.log" Mar 11 19:26:25 crc kubenswrapper[4842]: I0311 19:26:25.003710 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/pull/0.log" Mar 11 19:26:25 crc kubenswrapper[4842]: I0311 19:26:25.012187 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f96ff7ebfd8810e21ab0053011b00e23b9f2771268d62bd7bd4eafb62df7cq_2ab92dd8-8fc7-4aa5-b1df-24683fe9360b/extract/0.log" Mar 11 19:26:25 crc kubenswrapper[4842]: I0311 19:26:25.495233 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-w9hp6_763d79b9-8982-4ef6-8bc7-c2378f8208f0/manager/0.log" Mar 11 19:26:25 crc kubenswrapper[4842]: I0311 19:26:25.719197 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-m8dg9_de16110e-c77e-4513-b74b-86097ceb5a7d/manager/0.log" Mar 11 19:26:25 crc kubenswrapper[4842]: I0311 19:26:25.925934 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-chrh5_6211c7b4-3c01-49bc-9f4e-59872605f5fe/manager/0.log" Mar 11 19:26:26 crc kubenswrapper[4842]: I0311 19:26:26.138307 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-2vd9b_d87344b8-890b-4457-8f09-ec98bea8300e/manager/0.log" Mar 11 19:26:26 crc kubenswrapper[4842]: I0311 19:26:26.670194 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-dkj58_80959ea3-dca7-4a95-b049-d8df7ebd0ce0/manager/0.log" Mar 11 19:26:26 crc kubenswrapper[4842]: I0311 19:26:26.728854 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-sxlvs_cdb4f878-df19-48ad-bd71-88583edeb32a/manager/0.log" Mar 11 19:26:26 crc kubenswrapper[4842]: I0311 19:26:26.840876 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-677rf_024796ba-bf60-48db-962e-5d8bf962c127/manager/0.log" Mar 11 19:26:27 crc kubenswrapper[4842]: I0311 19:26:27.110479 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-btk6h_9c018477-14f2-4729-949a-25a46eae03ef/manager/0.log" Mar 11 19:26:27 crc kubenswrapper[4842]: I0311 19:26:27.200752 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-22vbs_bffda318-ec25-4b92-992b-50cf5fb2f6a5/manager/0.log" Mar 11 19:26:27 crc kubenswrapper[4842]: I0311 19:26:27.460464 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-gbfbf_935542fd-daef-458a-b3fe-e2d8291d6c44/manager/0.log" Mar 11 19:26:27 crc kubenswrapper[4842]: I0311 19:26:27.513612 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:27 crc kubenswrapper[4842]: I0311 19:26:27.513670 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:27 crc kubenswrapper[4842]: I0311 19:26:27.571151 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:27 crc 
kubenswrapper[4842]: I0311 19:26:27.577790 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-qd5nx_ab4b8857-4909-4289-888e-711796d175d8/manager/0.log" Mar 11 19:26:27 crc kubenswrapper[4842]: I0311 19:26:27.907493 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-index-wln8t_e5f44c0d-a601-4f29-a7eb-dc56c3cf3e46/registry-server/0.log" Mar 11 19:26:28 crc kubenswrapper[4842]: I0311 19:26:28.149824 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-f9xmb_efd1a4f4-f73f-425c-87e9-a63681ca5466/manager/0.log" Mar 11 19:26:28 crc kubenswrapper[4842]: I0311 19:26:28.233668 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:28 crc kubenswrapper[4842]: I0311 19:26:28.383705 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-6f598d9474-l5k2t_c4b3af5a-7447-41c9-8cc0-5e927157aecf/manager/0.log" Mar 11 19:26:28 crc kubenswrapper[4842]: I0311 19:26:28.398967 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b785bdc_463a4e68-9555-4065-aed2-91cdc5570602/manager/0.log" Mar 11 19:26:28 crc kubenswrapper[4842]: I0311 19:26:28.670301 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hvhrl_1b6f8f46-7c23-4380-b8e7-585c3e32ab04/registry-server/0.log" Mar 11 19:26:28 crc kubenswrapper[4842]: I0311 19:26:28.891688 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-m59rl_829f5c10-1f4f-4e84-a6ca-eba63ae106e2/manager/0.log" Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.007989 4842 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7547d775f4-htzsf_314e6ccd-fc9e-4fbd-8a69-006e5c0e6c00/manager/0.log" Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.039948 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-xdc5p_54acfc0e-ae41-490e-ba38-f88a427ff791/manager/0.log" Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.137907 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-7srdl_bfbdd09b-00b7-421c-911a-09e9720004f0/operator/0.log" Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.343232 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-h5ksp_56f77fcb-3101-446e-b070-ff1dcda13209/manager/0.log" Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.385196 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-czq5g_868a1fe1-c01f-4a07-b8d5-2d02985cc29d/manager/0.log" Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.610391 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp62d"] Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.781914 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-25rbp_0b197ad5-cb5c-483b-85c9-16578c56dd04/manager/0.log" Mar 11 19:26:29 crc kubenswrapper[4842]: I0311 19:26:29.871599 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-hlvdg_a3e2f9c3-1a9b-441e-87ac-07e25d805293/manager/0.log" Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.208552 4842 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-qp62d" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerName="registry-server" containerID="cri-o://153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f" gracePeriod=2 Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.757781 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.832364 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-catalog-content\") pod \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.832515 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-utilities\") pod \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.832584 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk54v\" (UniqueName: \"kubernetes.io/projected/ec03338a-6ad2-404c-85aa-f48c8398a7e4-kube-api-access-wk54v\") pod \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\" (UID: \"ec03338a-6ad2-404c-85aa-f48c8398a7e4\") " Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.834324 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-utilities" (OuterVolumeSpecName: "utilities") pod "ec03338a-6ad2-404c-85aa-f48c8398a7e4" (UID: "ec03338a-6ad2-404c-85aa-f48c8398a7e4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.845656 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec03338a-6ad2-404c-85aa-f48c8398a7e4-kube-api-access-wk54v" (OuterVolumeSpecName: "kube-api-access-wk54v") pod "ec03338a-6ad2-404c-85aa-f48c8398a7e4" (UID: "ec03338a-6ad2-404c-85aa-f48c8398a7e4"). InnerVolumeSpecName "kube-api-access-wk54v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.921630 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec03338a-6ad2-404c-85aa-f48c8398a7e4" (UID: "ec03338a-6ad2-404c-85aa-f48c8398a7e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.934600 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk54v\" (UniqueName: \"kubernetes.io/projected/ec03338a-6ad2-404c-85aa-f48c8398a7e4-kube-api-access-wk54v\") on node \"crc\" DevicePath \"\"" Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.934639 4842 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 19:26:30 crc kubenswrapper[4842]: I0311 19:26:30.934649 4842 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec03338a-6ad2-404c-85aa-f48c8398a7e4-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.220605 4842 generic.go:334] "Generic (PLEG): container finished" podID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" 
containerID="153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f" exitCode=0 Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.220652 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp62d" event={"ID":"ec03338a-6ad2-404c-85aa-f48c8398a7e4","Type":"ContainerDied","Data":"153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f"} Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.220681 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp62d" event={"ID":"ec03338a-6ad2-404c-85aa-f48c8398a7e4","Type":"ContainerDied","Data":"7b3e659f116e928044cd16445c032fc50d2d2d327285d30b9524739cf27de76d"} Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.220699 4842 scope.go:117] "RemoveContainer" containerID="153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.220835 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp62d" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.246115 4842 scope.go:117] "RemoveContainer" containerID="38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.248352 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp62d"] Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.283984 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qp62d"] Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.286482 4842 scope.go:117] "RemoveContainer" containerID="b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.304951 4842 scope.go:117] "RemoveContainer" containerID="153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f" Mar 11 19:26:31 crc kubenswrapper[4842]: E0311 19:26:31.305605 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f\": container with ID starting with 153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f not found: ID does not exist" containerID="153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.305651 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f"} err="failed to get container status \"153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f\": rpc error: code = NotFound desc = could not find container \"153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f\": container with ID starting with 153eacb766a80d7a54f5730b75264476374c8cdda75b1a898f47002ee04ba31f not 
found: ID does not exist" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.305679 4842 scope.go:117] "RemoveContainer" containerID="38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241" Mar 11 19:26:31 crc kubenswrapper[4842]: E0311 19:26:31.308373 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241\": container with ID starting with 38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241 not found: ID does not exist" containerID="38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.308428 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241"} err="failed to get container status \"38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241\": rpc error: code = NotFound desc = could not find container \"38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241\": container with ID starting with 38549b34ea4fd6c1b00d74435bde95432b4355de832486902ba48e6c36276241 not found: ID does not exist" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.308448 4842 scope.go:117] "RemoveContainer" containerID="b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc" Mar 11 19:26:31 crc kubenswrapper[4842]: E0311 19:26:31.308917 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc\": container with ID starting with b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc not found: ID does not exist" containerID="b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.308968 4842 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc"} err="failed to get container status \"b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc\": rpc error: code = NotFound desc = could not find container \"b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc\": container with ID starting with b2740db6714181ea9e6c4728bb1ff7457ba5d9dec2b75abc6f585125d54b24fc not found: ID does not exist" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.472448 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.472511 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.472564 4842 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.473250 4842 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"} pod="openshift-machine-config-operator/machine-config-daemon-csjgs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.473328 4842 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" containerID="cri-o://68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" gracePeriod=600 Mar 11 19:26:31 crc kubenswrapper[4842]: E0311 19:26:31.610870 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:26:31 crc kubenswrapper[4842]: I0311 19:26:31.749975 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-n67cw_a59c06c7-f7ea-4d35-9053-2d969ec7e7f9/manager/0.log" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.229493 4842 generic.go:334] "Generic (PLEG): container finished" podID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" exitCode=0 Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.229519 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerDied","Data":"68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"} Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.229881 4842 scope.go:117] "RemoveContainer" containerID="aa35ccc7978f645aca41cd60ffb442586f1c1d0afa2c03aac66ec6981fadd10b" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.230570 4842 scope.go:117] "RemoveContainer" 
containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:26:32 crc kubenswrapper[4842]: E0311 19:26:32.230802 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.525636 4842 scope.go:117] "RemoveContainer" containerID="e87336e6f4fc37d8d521a5d41a2980d13d6cc485191609f08bf992630e6a7c76" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.578679 4842 scope.go:117] "RemoveContainer" containerID="bb3cf014b033909853f63dc9436857d7962c07b4e9e816dea85324527c2bd4f5" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.605213 4842 scope.go:117] "RemoveContainer" containerID="59310e84aad42a1f15c234940050ad69e56ab79b8684f43851201e9625bca193" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.633659 4842 scope.go:117] "RemoveContainer" containerID="c52d1549df7ed945b169cd516fa77847b9096f38e9cc5fa49b918dbee880d998" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.675707 4842 scope.go:117] "RemoveContainer" containerID="00df56ef99efcef025e43d4a4421124718ae32918dcc854766eb18bb965154e4" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.720091 4842 scope.go:117] "RemoveContainer" containerID="18240a5aba4e417e66b11f8e254e8b82251caf43c2f4b8a02303a313fe3c2828" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.736777 4842 scope.go:117] "RemoveContainer" containerID="f1ed2a7d9055695e82596670612f7f30bdc582859969e06d8af1b308c701eb42" Mar 11 19:26:32 crc kubenswrapper[4842]: I0311 19:26:32.972724 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" path="/var/lib/kubelet/pods/ec03338a-6ad2-404c-85aa-f48c8398a7e4/volumes" Mar 11 19:26:43 crc kubenswrapper[4842]: I0311 19:26:43.961805 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:26:43 crc kubenswrapper[4842]: E0311 19:26:43.962374 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:26:49 crc kubenswrapper[4842]: I0311 19:26:49.239263 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9hng8_8df2a0ed-e9a6-4322-a130-4d2c8e4b4c55/control-plane-machine-set-operator/0.log" Mar 11 19:26:49 crc kubenswrapper[4842]: I0311 19:26:49.454200 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fmkft_28316cb3-4478-424c-bf38-43d5645ee769/machine-api-operator/0.log" Mar 11 19:26:49 crc kubenswrapper[4842]: I0311 19:26:49.484434 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fmkft_28316cb3-4478-424c-bf38-43d5645ee769/kube-rbac-proxy/0.log" Mar 11 19:26:54 crc kubenswrapper[4842]: I0311 19:26:54.969883 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:26:54 crc kubenswrapper[4842]: E0311 19:26:54.970665 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:27:01 crc kubenswrapper[4842]: I0311 19:27:01.914648 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-xwpqh_921b3579-f653-4d22-8118-31ed2c6ed61c/cert-manager-controller/0.log" Mar 11 19:27:02 crc kubenswrapper[4842]: I0311 19:27:02.113264 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-l4bwp_c8e1262e-1ae7-4979-ae34-605beb8c7c65/cert-manager-webhook/0.log" Mar 11 19:27:02 crc kubenswrapper[4842]: I0311 19:27:02.145601 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-d2sk4_f8f91c28-37fb-4480-aa01-8e5caf168fc4/cert-manager-cainjector/0.log" Mar 11 19:27:08 crc kubenswrapper[4842]: I0311 19:27:08.962613 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:27:08 crc kubenswrapper[4842]: E0311 19:27:08.963344 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:27:15 crc kubenswrapper[4842]: I0311 19:27:15.039024 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-8vdjz_4af38b63-4aad-4175-a610-44575dda0d08/nmstate-console-plugin/0.log" Mar 11 19:27:15 crc kubenswrapper[4842]: I0311 19:27:15.200229 4842 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-nhcr2_caae1282-da5e-4162-960f-500306facaf1/nmstate-handler/0.log" Mar 11 19:27:15 crc kubenswrapper[4842]: I0311 19:27:15.217223 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-2tbdg_ff0ba288-4474-4f9f-bf10-e4955b9142a0/kube-rbac-proxy/0.log" Mar 11 19:27:15 crc kubenswrapper[4842]: I0311 19:27:15.305258 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-2tbdg_ff0ba288-4474-4f9f-bf10-e4955b9142a0/nmstate-metrics/0.log" Mar 11 19:27:15 crc kubenswrapper[4842]: I0311 19:27:15.399226 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zm8f4_db180dd9-3a80-4f69-a4f9-ffbf52edfe72/nmstate-operator/0.log" Mar 11 19:27:15 crc kubenswrapper[4842]: I0311 19:27:15.489610 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rvwdc_405456d2-dac9-4c6b-93fb-ef142b02cd7e/nmstate-webhook/0.log" Mar 11 19:27:20 crc kubenswrapper[4842]: I0311 19:27:20.962021 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:27:20 crc kubenswrapper[4842]: E0311 19:27:20.962648 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:27:32 crc kubenswrapper[4842]: I0311 19:27:32.896316 4842 scope.go:117] "RemoveContainer" containerID="598cae17a2d9fe796c2e269d380af9410e3e9831be084520c46ac9e6fa531b60" Mar 11 19:27:32 crc kubenswrapper[4842]: I0311 
19:27:32.951231 4842 scope.go:117] "RemoveContainer" containerID="51468a3d139bd24a2038d0fb68f83f9f112581740abaa04bf221b7e4da368a89" Mar 11 19:27:32 crc kubenswrapper[4842]: I0311 19:27:32.987693 4842 scope.go:117] "RemoveContainer" containerID="038a1fdddc46f534bc00daf359efedf0d6cabbf8e304e639006e1a2d50f69ebe" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.009636 4842 scope.go:117] "RemoveContainer" containerID="f499c78a599c8049e6297f3303f43ba95723052b5b6cb10e70972d546699c221" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.046545 4842 scope.go:117] "RemoveContainer" containerID="3f3fe37d88ff32c4a0913783352ab36605afb090e771605be9e8df6ecb0f7be5" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.090511 4842 scope.go:117] "RemoveContainer" containerID="0f845aa1e8a92d4c660143a765bd14377a89070cdc38649a8c44d216bfbc017e" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.108757 4842 scope.go:117] "RemoveContainer" containerID="11074cf1586d8b964f3ac2b11c803178ea049b5f62ab260fb9b9a5f809f82e76" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.126135 4842 scope.go:117] "RemoveContainer" containerID="e6aba4d908ecd1c794034cb32cb88dff065808bc22b031ecfcf56dcacb306032" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.143379 4842 scope.go:117] "RemoveContainer" containerID="c385a966091517d2c86c87eba300536ca997fafdd28136caf47a7b41b884aab0" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.175312 4842 scope.go:117] "RemoveContainer" containerID="25fbed0be93ebd493774e753f783153d64120d27892582489799a861686f1961" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.192224 4842 scope.go:117] "RemoveContainer" containerID="a6022800a9ccab7df547181656fff0ff01039b471cafe4b9e7acea3e8c90c077" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.228445 4842 scope.go:117] "RemoveContainer" containerID="23cad88cbc670034dc3aca5f656c2fea9ef40add76b7ea8735ee4d3685011e2a" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.245222 4842 
scope.go:117] "RemoveContainer" containerID="58fe376c8f9e7c1a2e40edc01459579c9d7b6dac4451e7e1b13657bc47b1870a" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.267802 4842 scope.go:117] "RemoveContainer" containerID="36ac468fe0e0770d4e4de31748ef5f4bc6d049a0fcf9c851625518bb5b4ece62" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.283150 4842 scope.go:117] "RemoveContainer" containerID="3bc7e8588559d0df6449aafc7650dcaeeebc33d95898d976b0d7424329f9aa89" Mar 11 19:27:33 crc kubenswrapper[4842]: I0311 19:27:33.961974 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:27:33 crc kubenswrapper[4842]: E0311 19:27:33.962295 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:27:41 crc kubenswrapper[4842]: I0311 19:27:41.554978 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qgmvl_327901ef-4482-4254-be1f-daa388e6a1f2/kube-rbac-proxy/0.log" Mar 11 19:27:41 crc kubenswrapper[4842]: I0311 19:27:41.697087 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qgmvl_327901ef-4482-4254-be1f-daa388e6a1f2/controller/0.log" Mar 11 19:27:41 crc kubenswrapper[4842]: I0311 19:27:41.814290 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-frr-files/0.log" Mar 11 19:27:41 crc kubenswrapper[4842]: I0311 19:27:41.951430 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-frr-files/0.log" Mar 11 19:27:41 crc kubenswrapper[4842]: I0311 19:27:41.967763 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-reloader/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.024182 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-metrics/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.024626 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-reloader/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.182673 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-metrics/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.218718 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-frr-files/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.219546 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-metrics/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.228535 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-reloader/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.357470 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-frr-files/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.391425 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-reloader/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.411007 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/cp-metrics/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.422396 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/controller/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.562809 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/frr-metrics/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.623359 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/kube-rbac-proxy-frr/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.623793 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/kube-rbac-proxy/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.776851 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/reloader/0.log" Mar 11 19:27:42 crc kubenswrapper[4842]: I0311 19:27:42.806810 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-kmmqf_eca2d6af-43c5-40c2-9589-20e998cdd092/frr-k8s-webhook-server/0.log" Mar 11 19:27:43 crc kubenswrapper[4842]: I0311 19:27:43.041439 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5644775d48-l26mz_7922e71f-79aa-41c2-81c5-539d767c4d0e/manager/0.log" Mar 11 19:27:43 crc kubenswrapper[4842]: I0311 19:27:43.167605 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7856df687f-n2vkc_a2d79a37-4861-45b1-b7fd-7f489497cacf/webhook-server/0.log" Mar 11 19:27:43 crc kubenswrapper[4842]: I0311 19:27:43.291227 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8h7pw_24fd244e-dd50-4270-ad2c-950f5b3f7483/kube-rbac-proxy/0.log" Mar 11 19:27:43 crc kubenswrapper[4842]: I0311 19:27:43.644538 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8h7pw_24fd244e-dd50-4270-ad2c-950f5b3f7483/speaker/0.log" Mar 11 19:27:44 crc kubenswrapper[4842]: I0311 19:27:44.368654 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hkvd5_72b5889e-3eab-414e-ad5d-f6a74b2ec5fe/frr/0.log" Mar 11 19:27:48 crc kubenswrapper[4842]: I0311 19:27:48.961987 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:27:48 crc kubenswrapper[4842]: E0311 19:27:48.963567 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:27:57 crc kubenswrapper[4842]: I0311 19:27:57.039344 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_keystone-64f57b6d8c-cz78k_4291a0cb-5c38-424b-bc49-301aab1e1f1a/keystone-api/0.log" Mar 11 19:27:57 crc kubenswrapper[4842]: I0311 19:27:57.371309 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-cell1-galera-0_0e137603-1bc4-4ccf-ba33-09993a8e6e79/mysql-bootstrap/0.log" Mar 11 19:27:57 crc kubenswrapper[4842]: I0311 19:27:57.527400 4842 log.go:25] "Finished parsing 
log file" path="/var/log/pods/nova-kuttl-default_openstack-cell1-galera-0_0e137603-1bc4-4ccf-ba33-09993a8e6e79/mysql-bootstrap/0.log" Mar 11 19:27:57 crc kubenswrapper[4842]: I0311 19:27:57.558731 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-cell1-galera-0_0e137603-1bc4-4ccf-ba33-09993a8e6e79/galera/0.log" Mar 11 19:27:57 crc kubenswrapper[4842]: I0311 19:27:57.920872 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-galera-0_2b22b349-fc5f-4da6-818f-412f7dde5f00/mysql-bootstrap/0.log" Mar 11 19:27:58 crc kubenswrapper[4842]: I0311 19:27:58.093754 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-galera-0_2b22b349-fc5f-4da6-818f-412f7dde5f00/mysql-bootstrap/0.log" Mar 11 19:27:58 crc kubenswrapper[4842]: I0311 19:27:58.136255 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstack-galera-0_2b22b349-fc5f-4da6-818f-412f7dde5f00/galera/0.log" Mar 11 19:27:58 crc kubenswrapper[4842]: I0311 19:27:58.305832 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_openstackclient_b7dcae57-2024-4bfa-b657-f16d16bfd6c7/openstackclient/0.log" Mar 11 19:27:58 crc kubenswrapper[4842]: I0311 19:27:58.444648 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_placement-79b56db87d-ltvb2_35af45e3-739f-4769-a843-c951ad001e2e/placement-api/0.log" Mar 11 19:27:58 crc kubenswrapper[4842]: I0311 19:27:58.564381 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_placement-79b56db87d-ltvb2_35af45e3-739f-4769-a843-c951ad001e2e/placement-log/0.log" Mar 11 19:27:58 crc kubenswrapper[4842]: I0311 19:27:58.690757 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-broadcaster-server-0_baa6ffd5-2b78-4119-b6f1-a70465d5288d/setup-container/0.log" Mar 11 19:27:58 crc kubenswrapper[4842]: I0311 
19:27:58.883878 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-broadcaster-server-0_baa6ffd5-2b78-4119-b6f1-a70465d5288d/setup-container/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.008570 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-broadcaster-server-0_baa6ffd5-2b78-4119-b6f1-a70465d5288d/rabbitmq/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.127148 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-cell1-server-0_e12d431f-86df-44d1-9877-3eb3c698d089/setup-container/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.285660 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-cell1-server-0_e12d431f-86df-44d1-9877-3eb3c698d089/setup-container/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.310135 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-cell1-server-0_e12d431f-86df-44d1-9877-3eb3c698d089/rabbitmq/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.559725 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-notifications-server-0_8101bb7b-9fb5-418b-b490-e465171babc5/setup-container/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.615865 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_memcached-0_6f57e7eb-fa53-4182-9531-a3ebcd1df17c/memcached/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.724337 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-notifications-server-0_8101bb7b-9fb5-418b-b490-e465171babc5/setup-container/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.798364 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/nova-kuttl-default_rabbitmq-notifications-server-0_8101bb7b-9fb5-418b-b490-e465171babc5/rabbitmq/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.817672 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-server-0_13c13109-88f5-4c0d-9c15-739f9622af9d/setup-container/0.log" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.963121 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:27:59 crc kubenswrapper[4842]: E0311 19:27:59.963422 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:27:59 crc kubenswrapper[4842]: I0311 19:27:59.984108 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-server-0_13c13109-88f5-4c0d-9c15-739f9622af9d/setup-container/0.log" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.008551 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/nova-kuttl-default_rabbitmq-server-0_13c13109-88f5-4c0d-9c15-739f9622af9d/rabbitmq/0.log" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.141089 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554288-qlws8"] Mar 11 19:28:00 crc kubenswrapper[4842]: E0311 19:28:00.141521 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerName="registry-server" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.141537 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" 
containerName="registry-server" Mar 11 19:28:00 crc kubenswrapper[4842]: E0311 19:28:00.141562 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerName="extract-content" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.141570 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerName="extract-content" Mar 11 19:28:00 crc kubenswrapper[4842]: E0311 19:28:00.141614 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerName="extract-utilities" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.141624 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerName="extract-utilities" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.141811 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec03338a-6ad2-404c-85aa-f48c8398a7e4" containerName="registry-server" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.142541 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554288-qlws8" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.145382 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.145467 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.145386 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.148560 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554288-qlws8"] Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.180065 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvqn4\" (UniqueName: \"kubernetes.io/projected/70a41747-fa41-48aa-b29a-715511817938-kube-api-access-mvqn4\") pod \"auto-csr-approver-29554288-qlws8\" (UID: \"70a41747-fa41-48aa-b29a-715511817938\") " pod="openshift-infra/auto-csr-approver-29554288-qlws8" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.282046 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvqn4\" (UniqueName: \"kubernetes.io/projected/70a41747-fa41-48aa-b29a-715511817938-kube-api-access-mvqn4\") pod \"auto-csr-approver-29554288-qlws8\" (UID: \"70a41747-fa41-48aa-b29a-715511817938\") " pod="openshift-infra/auto-csr-approver-29554288-qlws8" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.302964 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvqn4\" (UniqueName: \"kubernetes.io/projected/70a41747-fa41-48aa-b29a-715511817938-kube-api-access-mvqn4\") pod \"auto-csr-approver-29554288-qlws8\" (UID: \"70a41747-fa41-48aa-b29a-715511817938\") " 
pod="openshift-infra/auto-csr-approver-29554288-qlws8" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.463824 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554288-qlws8" Mar 11 19:28:00 crc kubenswrapper[4842]: I0311 19:28:00.928430 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554288-qlws8"] Mar 11 19:28:00 crc kubenswrapper[4842]: W0311 19:28:00.930443 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70a41747_fa41_48aa_b29a_715511817938.slice/crio-202fc74c93f4d5cc158c87028c0b9881283feab24a0a65d3472ccdb57f46e8fb WatchSource:0}: Error finding container 202fc74c93f4d5cc158c87028c0b9881283feab24a0a65d3472ccdb57f46e8fb: Status 404 returned error can't find the container with id 202fc74c93f4d5cc158c87028c0b9881283feab24a0a65d3472ccdb57f46e8fb Mar 11 19:28:01 crc kubenswrapper[4842]: I0311 19:28:01.893443 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554288-qlws8" event={"ID":"70a41747-fa41-48aa-b29a-715511817938","Type":"ContainerStarted","Data":"202fc74c93f4d5cc158c87028c0b9881283feab24a0a65d3472ccdb57f46e8fb"} Mar 11 19:28:02 crc kubenswrapper[4842]: I0311 19:28:02.903727 4842 generic.go:334] "Generic (PLEG): container finished" podID="70a41747-fa41-48aa-b29a-715511817938" containerID="f04295390938f31957090ffd7a011c0bd93dcb6109ac9c04dca44b3965d72694" exitCode=0 Mar 11 19:28:02 crc kubenswrapper[4842]: I0311 19:28:02.903834 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554288-qlws8" event={"ID":"70a41747-fa41-48aa-b29a-715511817938","Type":"ContainerDied","Data":"f04295390938f31957090ffd7a011c0bd93dcb6109ac9c04dca44b3965d72694"} Mar 11 19:28:04 crc kubenswrapper[4842]: I0311 19:28:04.186919 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554288-qlws8" Mar 11 19:28:04 crc kubenswrapper[4842]: I0311 19:28:04.247861 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvqn4\" (UniqueName: \"kubernetes.io/projected/70a41747-fa41-48aa-b29a-715511817938-kube-api-access-mvqn4\") pod \"70a41747-fa41-48aa-b29a-715511817938\" (UID: \"70a41747-fa41-48aa-b29a-715511817938\") " Mar 11 19:28:04 crc kubenswrapper[4842]: I0311 19:28:04.254563 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a41747-fa41-48aa-b29a-715511817938-kube-api-access-mvqn4" (OuterVolumeSpecName: "kube-api-access-mvqn4") pod "70a41747-fa41-48aa-b29a-715511817938" (UID: "70a41747-fa41-48aa-b29a-715511817938"). InnerVolumeSpecName "kube-api-access-mvqn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:28:04 crc kubenswrapper[4842]: I0311 19:28:04.349575 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvqn4\" (UniqueName: \"kubernetes.io/projected/70a41747-fa41-48aa-b29a-715511817938-kube-api-access-mvqn4\") on node \"crc\" DevicePath \"\"" Mar 11 19:28:04 crc kubenswrapper[4842]: I0311 19:28:04.920865 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554288-qlws8" event={"ID":"70a41747-fa41-48aa-b29a-715511817938","Type":"ContainerDied","Data":"202fc74c93f4d5cc158c87028c0b9881283feab24a0a65d3472ccdb57f46e8fb"} Mar 11 19:28:04 crc kubenswrapper[4842]: I0311 19:28:04.920905 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554288-qlws8" Mar 11 19:28:04 crc kubenswrapper[4842]: I0311 19:28:04.920925 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="202fc74c93f4d5cc158c87028c0b9881283feab24a0a65d3472ccdb57f46e8fb" Mar 11 19:28:05 crc kubenswrapper[4842]: I0311 19:28:05.259780 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554282-lq6cv"] Mar 11 19:28:05 crc kubenswrapper[4842]: I0311 19:28:05.267019 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554282-lq6cv"] Mar 11 19:28:06 crc kubenswrapper[4842]: I0311 19:28:06.971091 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="828994bf-fa6e-4227-8b01-dc6d1b3b9cec" path="/var/lib/kubelet/pods/828994bf-fa6e-4227-8b01-dc6d1b3b9cec/volumes" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.350471 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm_edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5/util/0.log" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.516320 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm_edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5/util/0.log" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.528118 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm_edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5/pull/0.log" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.605861 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm_edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5/pull/0.log" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 
19:28:13.771890 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm_edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5/extract/0.log" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.783987 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm_edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5/util/0.log" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.785171 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874knpbm_edd54e0b-3b1b-41f5-bd4d-b50b002ea4c5/pull/0.log" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.963102 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:28:13 crc kubenswrapper[4842]: E0311 19:28:13.963465 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:28:13 crc kubenswrapper[4842]: I0311 19:28:13.976459 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk_255a795a-3aab-4e6f-a5b8-4baecc18d798/util/0.log" Mar 11 19:28:14 crc kubenswrapper[4842]: I0311 19:28:14.181418 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk_255a795a-3aab-4e6f-a5b8-4baecc18d798/util/0.log" Mar 11 19:28:14 crc kubenswrapper[4842]: I0311 
19:28:14.188368 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk_255a795a-3aab-4e6f-a5b8-4baecc18d798/pull/0.log" Mar 11 19:28:14 crc kubenswrapper[4842]: I0311 19:28:14.188689 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk_255a795a-3aab-4e6f-a5b8-4baecc18d798/pull/0.log" Mar 11 19:28:14 crc kubenswrapper[4842]: I0311 19:28:14.345931 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk_255a795a-3aab-4e6f-a5b8-4baecc18d798/util/0.log" Mar 11 19:28:15 crc kubenswrapper[4842]: I0311 19:28:15.477577 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m8dg9" podUID="de16110e-c77e-4513-b74b-86097ceb5a7d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.75:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:28:15 crc kubenswrapper[4842]: I0311 19:28:15.647709 4842 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9xmb" podUID="efd1a4f4-f73f-425c-87e9-a63681ca5466" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.84:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 19:28:15 crc kubenswrapper[4842]: I0311 19:28:15.822405 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v_65175dce-175c-44e3-b1f1-a3f3607e0a25/util/0.log" Mar 11 19:28:15 crc kubenswrapper[4842]: I0311 19:28:15.823543 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk_255a795a-3aab-4e6f-a5b8-4baecc18d798/extract/0.log" Mar 11 19:28:15 crc kubenswrapper[4842]: I0311 19:28:15.987812 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v_65175dce-175c-44e3-b1f1-a3f3607e0a25/util/0.log" Mar 11 19:28:15 crc kubenswrapper[4842]: I0311 19:28:15.988314 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1776fk_255a795a-3aab-4e6f-a5b8-4baecc18d798/pull/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.049967 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v_65175dce-175c-44e3-b1f1-a3f3607e0a25/pull/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.128898 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v_65175dce-175c-44e3-b1f1-a3f3607e0a25/pull/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.230088 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v_65175dce-175c-44e3-b1f1-a3f3607e0a25/extract/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.248294 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v_65175dce-175c-44e3-b1f1-a3f3607e0a25/util/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.269332 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twm9v_65175dce-175c-44e3-b1f1-a3f3607e0a25/pull/0.log" Mar 
11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.415765 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c75p2_503b63c7-1278-4eec-84bc-86223fe3ad04/extract-utilities/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.582485 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c75p2_503b63c7-1278-4eec-84bc-86223fe3ad04/extract-utilities/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.614415 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c75p2_503b63c7-1278-4eec-84bc-86223fe3ad04/extract-content/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.620630 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c75p2_503b63c7-1278-4eec-84bc-86223fe3ad04/extract-content/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.774094 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c75p2_503b63c7-1278-4eec-84bc-86223fe3ad04/extract-content/0.log" Mar 11 19:28:16 crc kubenswrapper[4842]: I0311 19:28:16.774107 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c75p2_503b63c7-1278-4eec-84bc-86223fe3ad04/extract-utilities/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.023100 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jtlqq_16d57199-f634-477a-a8d1-5c1f6c97f24b/extract-utilities/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.141752 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-c75p2_503b63c7-1278-4eec-84bc-86223fe3ad04/registry-server/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.226701 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jtlqq_16d57199-f634-477a-a8d1-5c1f6c97f24b/extract-content/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.230894 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jtlqq_16d57199-f634-477a-a8d1-5c1f6c97f24b/extract-utilities/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.289668 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jtlqq_16d57199-f634-477a-a8d1-5c1f6c97f24b/extract-content/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.423232 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jtlqq_16d57199-f634-477a-a8d1-5c1f6c97f24b/extract-content/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.428348 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jtlqq_16d57199-f634-477a-a8d1-5c1f6c97f24b/extract-utilities/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.627891 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nqsd8_e45c9752-2328-495a-88dd-a6c769b7f012/marketplace-operator/0.log" Mar 11 19:28:17 crc kubenswrapper[4842]: I0311 19:28:17.769654 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kzpq5_b55039ae-0af8-42cc-a100-7b8893fb9400/extract-utilities/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.056765 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kzpq5_b55039ae-0af8-42cc-a100-7b8893fb9400/extract-utilities/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.087596 4842 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-kzpq5_b55039ae-0af8-42cc-a100-7b8893fb9400/extract-content/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.104577 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kzpq5_b55039ae-0af8-42cc-a100-7b8893fb9400/extract-content/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.261118 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jtlqq_16d57199-f634-477a-a8d1-5c1f6c97f24b/registry-server/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.310561 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kzpq5_b55039ae-0af8-42cc-a100-7b8893fb9400/extract-content/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.336102 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kzpq5_b55039ae-0af8-42cc-a100-7b8893fb9400/extract-utilities/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.435250 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-kzpq5_b55039ae-0af8-42cc-a100-7b8893fb9400/registry-server/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.515517 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jzgww_f914e9d1-11a6-46bd-af88-7a238ade220f/extract-utilities/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.697520 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jzgww_f914e9d1-11a6-46bd-af88-7a238ade220f/extract-content/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.709303 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jzgww_f914e9d1-11a6-46bd-af88-7a238ade220f/extract-utilities/0.log" 
Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.709535 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jzgww_f914e9d1-11a6-46bd-af88-7a238ade220f/extract-content/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.845934 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jzgww_f914e9d1-11a6-46bd-af88-7a238ade220f/extract-utilities/0.log" Mar 11 19:28:18 crc kubenswrapper[4842]: I0311 19:28:18.862995 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jzgww_f914e9d1-11a6-46bd-af88-7a238ade220f/extract-content/0.log" Mar 11 19:28:19 crc kubenswrapper[4842]: I0311 19:28:19.434369 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jzgww_f914e9d1-11a6-46bd-af88-7a238ade220f/registry-server/0.log" Mar 11 19:28:28 crc kubenswrapper[4842]: I0311 19:28:28.962886 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:28:28 crc kubenswrapper[4842]: E0311 19:28:28.963731 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" Mar 11 19:28:33 crc kubenswrapper[4842]: I0311 19:28:33.503840 4842 scope.go:117] "RemoveContainer" containerID="d13644760e036f5c2aeb72c6729f9ae6de24b5bc469328939bee3e7c981dba6e" Mar 11 19:28:40 crc kubenswrapper[4842]: I0311 19:28:40.967264 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92" Mar 11 19:28:40 crc 
kubenswrapper[4842]: E0311 19:28:40.968017 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:28:51 crc kubenswrapper[4842]: I0311 19:28:51.962110 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:28:51 crc kubenswrapper[4842]: E0311 19:28:51.963676 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:29:05 crc kubenswrapper[4842]: I0311 19:29:05.962297 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:29:05 crc kubenswrapper[4842]: E0311 19:29:05.963075 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:29:17 crc kubenswrapper[4842]: I0311 19:29:17.963133 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:29:17 crc kubenswrapper[4842]: E0311 19:29:17.964326 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:29:30 crc kubenswrapper[4842]: I0311 19:29:30.962962 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:29:30 crc kubenswrapper[4842]: E0311 19:29:30.965254 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:29:33 crc kubenswrapper[4842]: I0311 19:29:33.574516 4842 scope.go:117] "RemoveContainer" containerID="1854ee38df937e7f30868ec7405a09f697d8be7cf595110682c45572dd57640d"
Mar 11 19:29:33 crc kubenswrapper[4842]: I0311 19:29:33.598402 4842 scope.go:117] "RemoveContainer" containerID="b01b8cf1f8126371cd36db897da5cf7c85e239890ae9badcaa7e67558892f64e"
Mar 11 19:29:33 crc kubenswrapper[4842]: I0311 19:29:33.638326 4842 scope.go:117] "RemoveContainer" containerID="2aea09330816f211682dc660c7039ee0c2b32e7b3dd1c462dd23655d4aecc2e3"
Mar 11 19:29:33 crc kubenswrapper[4842]: I0311 19:29:33.680231 4842 scope.go:117] "RemoveContainer" containerID="fedcfbcbcc9e3ed87d00733edec9f92582c4f7fd06b9f805dacbb50a57ffe3a0"
Mar 11 19:29:42 crc kubenswrapper[4842]: I0311 19:29:42.659191 4842 generic.go:334] "Generic (PLEG): container finished" podID="dfb41914-e773-4d85-9875-580c71cd1414" containerID="dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2" exitCode=0
Mar 11 19:29:42 crc kubenswrapper[4842]: I0311 19:29:42.659306 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lxdz8/must-gather-pmx82" event={"ID":"dfb41914-e773-4d85-9875-580c71cd1414","Type":"ContainerDied","Data":"dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2"}
Mar 11 19:29:42 crc kubenswrapper[4842]: I0311 19:29:42.660352 4842 scope.go:117] "RemoveContainer" containerID="dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2"
Mar 11 19:29:43 crc kubenswrapper[4842]: I0311 19:29:43.110506 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lxdz8_must-gather-pmx82_dfb41914-e773-4d85-9875-580c71cd1414/gather/0.log"
Mar 11 19:29:43 crc kubenswrapper[4842]: I0311 19:29:43.962712 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:29:43 crc kubenswrapper[4842]: E0311 19:29:43.962954 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.288645 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lxdz8/must-gather-pmx82"]
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.289138 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lxdz8/must-gather-pmx82" podUID="dfb41914-e773-4d85-9875-580c71cd1414" containerName="copy" containerID="cri-o://24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f" gracePeriod=2
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.298511 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lxdz8/must-gather-pmx82"]
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.715018 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lxdz8_must-gather-pmx82_dfb41914-e773-4d85-9875-580c71cd1414/copy/0.log"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.715401 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.731122 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgfzh\" (UniqueName: \"kubernetes.io/projected/dfb41914-e773-4d85-9875-580c71cd1414-kube-api-access-pgfzh\") pod \"dfb41914-e773-4d85-9875-580c71cd1414\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") "
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.731194 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb41914-e773-4d85-9875-580c71cd1414-must-gather-output\") pod \"dfb41914-e773-4d85-9875-580c71cd1414\" (UID: \"dfb41914-e773-4d85-9875-580c71cd1414\") "
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.737735 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb41914-e773-4d85-9875-580c71cd1414-kube-api-access-pgfzh" (OuterVolumeSpecName: "kube-api-access-pgfzh") pod "dfb41914-e773-4d85-9875-580c71cd1414" (UID: "dfb41914-e773-4d85-9875-580c71cd1414"). InnerVolumeSpecName "kube-api-access-pgfzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.832799 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgfzh\" (UniqueName: \"kubernetes.io/projected/dfb41914-e773-4d85-9875-580c71cd1414-kube-api-access-pgfzh\") on node \"crc\" DevicePath \"\""
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.848170 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfb41914-e773-4d85-9875-580c71cd1414-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfb41914-e773-4d85-9875-580c71cd1414" (UID: "dfb41914-e773-4d85-9875-580c71cd1414"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.890361 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lxdz8_must-gather-pmx82_dfb41914-e773-4d85-9875-580c71cd1414/copy/0.log"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.890853 4842 generic.go:334] "Generic (PLEG): container finished" podID="dfb41914-e773-4d85-9875-580c71cd1414" containerID="24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f" exitCode=143
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.890915 4842 scope.go:117] "RemoveContainer" containerID="24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.891105 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lxdz8/must-gather-pmx82"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.911518 4842 scope.go:117] "RemoveContainer" containerID="dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.936183 4842 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfb41914-e773-4d85-9875-580c71cd1414-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.979784 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb41914-e773-4d85-9875-580c71cd1414" path="/var/lib/kubelet/pods/dfb41914-e773-4d85-9875-580c71cd1414/volumes"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.997090 4842 scope.go:117] "RemoveContainer" containerID="24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f"
Mar 11 19:29:52 crc kubenswrapper[4842]: E0311 19:29:52.997807 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f\": container with ID starting with 24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f not found: ID does not exist" containerID="24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.997867 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f"} err="failed to get container status \"24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f\": rpc error: code = NotFound desc = could not find container \"24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f\": container with ID starting with 24a65162ab12bb214fb46f95973e7f371e55ae637c34a8b6eeda22fa14a6181f not found: ID does not exist"
Mar 11 19:29:52 crc kubenswrapper[4842]: I0311 19:29:52.997899 4842 scope.go:117] "RemoveContainer" containerID="dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2"
Mar 11 19:29:53 crc kubenswrapper[4842]: E0311 19:29:52.999939 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2\": container with ID starting with dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2 not found: ID does not exist" containerID="dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2"
Mar 11 19:29:53 crc kubenswrapper[4842]: I0311 19:29:52.999999 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2"} err="failed to get container status \"dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2\": rpc error: code = NotFound desc = could not find container \"dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2\": container with ID starting with dc8603860682edfb07591c69d2dd65dbca9f856bbfdaf80cee8c90c1542dbca2 not found: ID does not exist"
Mar 11 19:29:56 crc kubenswrapper[4842]: I0311 19:29:56.962482 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:29:56 crc kubenswrapper[4842]: E0311 19:29:56.963220 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.143921 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554290-8gsxw"]
Mar 11 19:30:00 crc kubenswrapper[4842]: E0311 19:30:00.145045 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a41747-fa41-48aa-b29a-715511817938" containerName="oc"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.145060 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a41747-fa41-48aa-b29a-715511817938" containerName="oc"
Mar 11 19:30:00 crc kubenswrapper[4842]: E0311 19:30:00.145077 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb41914-e773-4d85-9875-580c71cd1414" containerName="copy"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.145085 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb41914-e773-4d85-9875-580c71cd1414" containerName="copy"
Mar 11 19:30:00 crc kubenswrapper[4842]: E0311 19:30:00.145108 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb41914-e773-4d85-9875-580c71cd1414" containerName="gather"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.145118 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb41914-e773-4d85-9875-580c71cd1414" containerName="gather"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.145312 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb41914-e773-4d85-9875-580c71cd1414" containerName="copy"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.145332 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb41914-e773-4d85-9875-580c71cd1414" containerName="gather"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.145342 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a41747-fa41-48aa-b29a-715511817938" containerName="oc"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.146125 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554290-8gsxw"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.148670 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.149218 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.149361 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.162992 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"]
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.164302 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.167797 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.167890 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.168462 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rxsn\" (UniqueName: \"kubernetes.io/projected/b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63-kube-api-access-8rxsn\") pod \"auto-csr-approver-29554290-8gsxw\" (UID: \"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63\") " pod="openshift-infra/auto-csr-approver-29554290-8gsxw"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.176447 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554290-8gsxw"]
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.190764 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"]
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.272230 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bb8f502-8e11-4760-9bc2-2ef90240dc16-config-volume\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.272348 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rxsn\" (UniqueName: \"kubernetes.io/projected/b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63-kube-api-access-8rxsn\") pod \"auto-csr-approver-29554290-8gsxw\" (UID: \"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63\") " pod="openshift-infra/auto-csr-approver-29554290-8gsxw"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.272434 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bb8f502-8e11-4760-9bc2-2ef90240dc16-secret-volume\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.272496 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6gz\" (UniqueName: \"kubernetes.io/projected/8bb8f502-8e11-4760-9bc2-2ef90240dc16-kube-api-access-zs6gz\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.296515 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rxsn\" (UniqueName: \"kubernetes.io/projected/b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63-kube-api-access-8rxsn\") pod \"auto-csr-approver-29554290-8gsxw\" (UID: \"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63\") " pod="openshift-infra/auto-csr-approver-29554290-8gsxw"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.373576 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bb8f502-8e11-4760-9bc2-2ef90240dc16-secret-volume\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.373931 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs6gz\" (UniqueName: \"kubernetes.io/projected/8bb8f502-8e11-4760-9bc2-2ef90240dc16-kube-api-access-zs6gz\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.374002 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bb8f502-8e11-4760-9bc2-2ef90240dc16-config-volume\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.374882 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bb8f502-8e11-4760-9bc2-2ef90240dc16-config-volume\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.378955 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bb8f502-8e11-4760-9bc2-2ef90240dc16-secret-volume\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.399129 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6gz\" (UniqueName: \"kubernetes.io/projected/8bb8f502-8e11-4760-9bc2-2ef90240dc16-kube-api-access-zs6gz\") pod \"collect-profiles-29554290-g45xp\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.463421 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554290-8gsxw"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.481364 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.768954 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"]
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.910912 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554290-8gsxw"]
Mar 11 19:30:00 crc kubenswrapper[4842]: W0311 19:30:00.913508 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3ec4a8f_e0fd_447f_b6d0_67a0439f7b63.slice/crio-87f910e5f0b817e98bb917756d92f070a0cabdc6a23e577ef86f0857c31dcb98 WatchSource:0}: Error finding container 87f910e5f0b817e98bb917756d92f070a0cabdc6a23e577ef86f0857c31dcb98: Status 404 returned error can't find the container with id 87f910e5f0b817e98bb917756d92f070a0cabdc6a23e577ef86f0857c31dcb98
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.920134 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.959891 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp" event={"ID":"8bb8f502-8e11-4760-9bc2-2ef90240dc16","Type":"ContainerStarted","Data":"03bd2094dc9b6bc5db32c17919622f599eb9a1d77cc103e66bcdd6c06cc5f1dd"}
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.960364 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp" event={"ID":"8bb8f502-8e11-4760-9bc2-2ef90240dc16","Type":"ContainerStarted","Data":"0a2011b5a1a46189d6ec53ad16dd0229c70a97f402eb9e2a5fd68d1ac65ea348"}
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.963087 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554290-8gsxw" event={"ID":"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63","Type":"ContainerStarted","Data":"87f910e5f0b817e98bb917756d92f070a0cabdc6a23e577ef86f0857c31dcb98"}
Mar 11 19:30:00 crc kubenswrapper[4842]: I0311 19:30:00.985077 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp" podStartSLOduration=0.984930237 podStartE2EDuration="984.930237ms" podCreationTimestamp="2026-03-11 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 19:30:00.981463735 +0000 UTC m=+2446.629160015" watchObservedRunningTime="2026-03-11 19:30:00.984930237 +0000 UTC m=+2446.632626517"
Mar 11 19:30:01 crc kubenswrapper[4842]: I0311 19:30:01.972206 4842 generic.go:334] "Generic (PLEG): container finished" podID="8bb8f502-8e11-4760-9bc2-2ef90240dc16" containerID="03bd2094dc9b6bc5db32c17919622f599eb9a1d77cc103e66bcdd6c06cc5f1dd" exitCode=0
Mar 11 19:30:01 crc kubenswrapper[4842]: I0311 19:30:01.972308 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp" event={"ID":"8bb8f502-8e11-4760-9bc2-2ef90240dc16","Type":"ContainerDied","Data":"03bd2094dc9b6bc5db32c17919622f599eb9a1d77cc103e66bcdd6c06cc5f1dd"}
Mar 11 19:30:02 crc kubenswrapper[4842]: I0311 19:30:02.987845 4842 generic.go:334] "Generic (PLEG): container finished" podID="b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63" containerID="7ab1c0bb77714e1f587d38170ee934cb54c0675244ef6427da4415e16a969853" exitCode=0
Mar 11 19:30:02 crc kubenswrapper[4842]: I0311 19:30:02.987951 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554290-8gsxw" event={"ID":"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63","Type":"ContainerDied","Data":"7ab1c0bb77714e1f587d38170ee934cb54c0675244ef6427da4415e16a969853"}
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.266368 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.429388 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs6gz\" (UniqueName: \"kubernetes.io/projected/8bb8f502-8e11-4760-9bc2-2ef90240dc16-kube-api-access-zs6gz\") pod \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") "
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.429904 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bb8f502-8e11-4760-9bc2-2ef90240dc16-secret-volume\") pod \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") "
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.429984 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bb8f502-8e11-4760-9bc2-2ef90240dc16-config-volume\") pod \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\" (UID: \"8bb8f502-8e11-4760-9bc2-2ef90240dc16\") "
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.430614 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb8f502-8e11-4760-9bc2-2ef90240dc16-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bb8f502-8e11-4760-9bc2-2ef90240dc16" (UID: "8bb8f502-8e11-4760-9bc2-2ef90240dc16"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.435380 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb8f502-8e11-4760-9bc2-2ef90240dc16-kube-api-access-zs6gz" (OuterVolumeSpecName: "kube-api-access-zs6gz") pod "8bb8f502-8e11-4760-9bc2-2ef90240dc16" (UID: "8bb8f502-8e11-4760-9bc2-2ef90240dc16"). InnerVolumeSpecName "kube-api-access-zs6gz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.435806 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb8f502-8e11-4760-9bc2-2ef90240dc16-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8bb8f502-8e11-4760-9bc2-2ef90240dc16" (UID: "8bb8f502-8e11-4760-9bc2-2ef90240dc16"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.531674 4842 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bb8f502-8e11-4760-9bc2-2ef90240dc16-config-volume\") on node \"crc\" DevicePath \"\""
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.531724 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs6gz\" (UniqueName: \"kubernetes.io/projected/8bb8f502-8e11-4760-9bc2-2ef90240dc16-kube-api-access-zs6gz\") on node \"crc\" DevicePath \"\""
Mar 11 19:30:03 crc kubenswrapper[4842]: I0311 19:30:03.531739 4842 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bb8f502-8e11-4760-9bc2-2ef90240dc16-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.000156 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp" event={"ID":"8bb8f502-8e11-4760-9bc2-2ef90240dc16","Type":"ContainerDied","Data":"0a2011b5a1a46189d6ec53ad16dd0229c70a97f402eb9e2a5fd68d1ac65ea348"}
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.000227 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a2011b5a1a46189d6ec53ad16dd0229c70a97f402eb9e2a5fd68d1ac65ea348"
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.000296 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29554290-g45xp"
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.336316 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554290-8gsxw"
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.342955 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt"]
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.350856 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29554245-ch5xt"]
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.447629 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rxsn\" (UniqueName: \"kubernetes.io/projected/b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63-kube-api-access-8rxsn\") pod \"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63\" (UID: \"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63\") "
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.451607 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63-kube-api-access-8rxsn" (OuterVolumeSpecName: "kube-api-access-8rxsn") pod "b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63" (UID: "b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63"). InnerVolumeSpecName "kube-api-access-8rxsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.549940 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rxsn\" (UniqueName: \"kubernetes.io/projected/b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63-kube-api-access-8rxsn\") on node \"crc\" DevicePath \"\""
Mar 11 19:30:04 crc kubenswrapper[4842]: I0311 19:30:04.973347 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2bd84be-1f01-47e4-a35e-4ed993d4be9b" path="/var/lib/kubelet/pods/a2bd84be-1f01-47e4-a35e-4ed993d4be9b/volumes"
Mar 11 19:30:05 crc kubenswrapper[4842]: I0311 19:30:05.011358 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554290-8gsxw" event={"ID":"b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63","Type":"ContainerDied","Data":"87f910e5f0b817e98bb917756d92f070a0cabdc6a23e577ef86f0857c31dcb98"}
Mar 11 19:30:05 crc kubenswrapper[4842]: I0311 19:30:05.011444 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f910e5f0b817e98bb917756d92f070a0cabdc6a23e577ef86f0857c31dcb98"
Mar 11 19:30:05 crc kubenswrapper[4842]: I0311 19:30:05.011598 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554290-8gsxw"
Mar 11 19:30:05 crc kubenswrapper[4842]: I0311 19:30:05.388620 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554284-9c588"]
Mar 11 19:30:05 crc kubenswrapper[4842]: I0311 19:30:05.395389 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554284-9c588"]
Mar 11 19:30:06 crc kubenswrapper[4842]: I0311 19:30:06.973608 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6827f555-4985-4131-8bc6-2df2bc76ed73" path="/var/lib/kubelet/pods/6827f555-4985-4131-8bc6-2df2bc76ed73/volumes"
Mar 11 19:30:11 crc kubenswrapper[4842]: I0311 19:30:11.961930 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:30:11 crc kubenswrapper[4842]: E0311 19:30:11.962789 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:30:23 crc kubenswrapper[4842]: I0311 19:30:23.963312 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:30:23 crc kubenswrapper[4842]: E0311 19:30:23.965136 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:30:33 crc kubenswrapper[4842]: I0311 19:30:33.772771 4842 scope.go:117] "RemoveContainer" containerID="f0c71b5e37069aa7fd05e8a991e02e22b32f744273fdbe6f859af7480faf366a"
Mar 11 19:30:33 crc kubenswrapper[4842]: I0311 19:30:33.799521 4842 scope.go:117] "RemoveContainer" containerID="7c0a3d1e62e5ae48da938896416b422a3b9be23d54e2edae443225cfc43d93af"
Mar 11 19:30:35 crc kubenswrapper[4842]: I0311 19:30:35.962873 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:30:35 crc kubenswrapper[4842]: E0311 19:30:35.963843 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:30:47 crc kubenswrapper[4842]: I0311 19:30:47.962248 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:30:47 crc kubenswrapper[4842]: E0311 19:30:47.962912 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:31:00 crc kubenswrapper[4842]: I0311 19:31:00.962350 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:31:00 crc kubenswrapper[4842]: E0311 19:31:00.963204 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:31:13 crc kubenswrapper[4842]: I0311 19:31:13.961872 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:31:13 crc kubenswrapper[4842]: E0311 19:31:13.962786 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:31:26 crc kubenswrapper[4842]: I0311 19:31:26.961989 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:31:26 crc kubenswrapper[4842]: E0311 19:31:26.962710 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-csjgs_openshift-machine-config-operator(12f22b8b-b227-48b3-b1f1-322dfe40e383)\"" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383"
Mar 11 19:31:39 crc kubenswrapper[4842]: I0311 19:31:39.962211 4842 scope.go:117] "RemoveContainer" containerID="68c41e54d0389afd75ca3c4cfec7a5d99f4dc79b4b94f253275814fc5b5f1a92"
Mar 11 19:31:40 crc kubenswrapper[4842]: I0311 19:31:40.832413 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" event={"ID":"12f22b8b-b227-48b3-b1f1-322dfe40e383","Type":"ContainerStarted","Data":"84da163f8e511788d56229b4253cebd79d37bcb52cd4bfbdfcf9a7df55f20a2d"}
Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.142748 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554292-dflb4"]
Mar 11 19:32:00 crc kubenswrapper[4842]: E0311 19:32:00.144168 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb8f502-8e11-4760-9bc2-2ef90240dc16" containerName="collect-profiles"
Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.144188 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb8f502-8e11-4760-9bc2-2ef90240dc16" containerName="collect-profiles"
Mar 11 19:32:00 crc kubenswrapper[4842]: E0311 19:32:00.144229 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63" containerName="oc"
Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.144244 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63" containerName="oc"
Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.145358 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ec4a8f-e0fd-447f-b6d0-67a0439f7b63" containerName="oc"
Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.145415 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb8f502-8e11-4760-9bc2-2ef90240dc16" containerName="collect-profiles"
Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.146482 4842 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554292-dflb4" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.151218 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.151869 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.152016 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.163517 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554292-dflb4"] Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.186036 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlkmv\" (UniqueName: \"kubernetes.io/projected/361245ea-745b-4d1a-b73a-24dade787ead-kube-api-access-rlkmv\") pod \"auto-csr-approver-29554292-dflb4\" (UID: \"361245ea-745b-4d1a-b73a-24dade787ead\") " pod="openshift-infra/auto-csr-approver-29554292-dflb4" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.287173 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlkmv\" (UniqueName: \"kubernetes.io/projected/361245ea-745b-4d1a-b73a-24dade787ead-kube-api-access-rlkmv\") pod \"auto-csr-approver-29554292-dflb4\" (UID: \"361245ea-745b-4d1a-b73a-24dade787ead\") " pod="openshift-infra/auto-csr-approver-29554292-dflb4" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.308527 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlkmv\" (UniqueName: \"kubernetes.io/projected/361245ea-745b-4d1a-b73a-24dade787ead-kube-api-access-rlkmv\") pod \"auto-csr-approver-29554292-dflb4\" (UID: \"361245ea-745b-4d1a-b73a-24dade787ead\") " 
pod="openshift-infra/auto-csr-approver-29554292-dflb4" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.467678 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554292-dflb4" Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.931824 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554292-dflb4"] Mar 11 19:32:00 crc kubenswrapper[4842]: I0311 19:32:00.973390 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554292-dflb4" event={"ID":"361245ea-745b-4d1a-b73a-24dade787ead","Type":"ContainerStarted","Data":"3317591c29cdf35006801e9a152f4dab3462e9c9ad74b5b96b9b2a9d983a8183"} Mar 11 19:32:02 crc kubenswrapper[4842]: I0311 19:32:02.989028 4842 generic.go:334] "Generic (PLEG): container finished" podID="361245ea-745b-4d1a-b73a-24dade787ead" containerID="b8bfd2f0b2e289a8f305a0aa86b3b6400075852c093def49f5b895b0197b27e8" exitCode=0 Mar 11 19:32:02 crc kubenswrapper[4842]: I0311 19:32:02.989090 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554292-dflb4" event={"ID":"361245ea-745b-4d1a-b73a-24dade787ead","Type":"ContainerDied","Data":"b8bfd2f0b2e289a8f305a0aa86b3b6400075852c093def49f5b895b0197b27e8"} Mar 11 19:32:04 crc kubenswrapper[4842]: I0311 19:32:04.281146 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554292-dflb4" Mar 11 19:32:04 crc kubenswrapper[4842]: I0311 19:32:04.355038 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlkmv\" (UniqueName: \"kubernetes.io/projected/361245ea-745b-4d1a-b73a-24dade787ead-kube-api-access-rlkmv\") pod \"361245ea-745b-4d1a-b73a-24dade787ead\" (UID: \"361245ea-745b-4d1a-b73a-24dade787ead\") " Mar 11 19:32:04 crc kubenswrapper[4842]: I0311 19:32:04.360026 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361245ea-745b-4d1a-b73a-24dade787ead-kube-api-access-rlkmv" (OuterVolumeSpecName: "kube-api-access-rlkmv") pod "361245ea-745b-4d1a-b73a-24dade787ead" (UID: "361245ea-745b-4d1a-b73a-24dade787ead"). InnerVolumeSpecName "kube-api-access-rlkmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:32:04 crc kubenswrapper[4842]: I0311 19:32:04.458028 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlkmv\" (UniqueName: \"kubernetes.io/projected/361245ea-745b-4d1a-b73a-24dade787ead-kube-api-access-rlkmv\") on node \"crc\" DevicePath \"\"" Mar 11 19:32:05 crc kubenswrapper[4842]: I0311 19:32:05.007977 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554292-dflb4" event={"ID":"361245ea-745b-4d1a-b73a-24dade787ead","Type":"ContainerDied","Data":"3317591c29cdf35006801e9a152f4dab3462e9c9ad74b5b96b9b2a9d983a8183"} Mar 11 19:32:05 crc kubenswrapper[4842]: I0311 19:32:05.008032 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554292-dflb4" Mar 11 19:32:05 crc kubenswrapper[4842]: I0311 19:32:05.008027 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3317591c29cdf35006801e9a152f4dab3462e9c9ad74b5b96b9b2a9d983a8183" Mar 11 19:32:05 crc kubenswrapper[4842]: I0311 19:32:05.341304 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554286-vhj56"] Mar 11 19:32:05 crc kubenswrapper[4842]: I0311 19:32:05.346514 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554286-vhj56"] Mar 11 19:32:06 crc kubenswrapper[4842]: I0311 19:32:06.971648 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d5852b-6012-4a40-b7cd-c93401c6669f" path="/var/lib/kubelet/pods/69d5852b-6012-4a40-b7cd-c93401c6669f/volumes" Mar 11 19:32:33 crc kubenswrapper[4842]: I0311 19:32:33.896844 4842 scope.go:117] "RemoveContainer" containerID="01102a2d073002746d7c2d8097cfb7b19eabafa131a7e9f260837903bdd606c5" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.154418 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29554294-j9grf"] Mar 11 19:34:00 crc kubenswrapper[4842]: E0311 19:34:00.155738 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361245ea-745b-4d1a-b73a-24dade787ead" containerName="oc" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.155781 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="361245ea-745b-4d1a-b73a-24dade787ead" containerName="oc" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.156571 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="361245ea-745b-4d1a-b73a-24dade787ead" containerName="oc" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.157714 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29554294-j9grf" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.161137 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.161262 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-gx5tm" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.161300 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.180615 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554294-j9grf"] Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.285146 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9k7l\" (UniqueName: \"kubernetes.io/projected/4e8ed95c-0459-40c7-a347-65e80a407f36-kube-api-access-b9k7l\") pod \"auto-csr-approver-29554294-j9grf\" (UID: \"4e8ed95c-0459-40c7-a347-65e80a407f36\") " pod="openshift-infra/auto-csr-approver-29554294-j9grf" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.386858 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9k7l\" (UniqueName: \"kubernetes.io/projected/4e8ed95c-0459-40c7-a347-65e80a407f36-kube-api-access-b9k7l\") pod \"auto-csr-approver-29554294-j9grf\" (UID: \"4e8ed95c-0459-40c7-a347-65e80a407f36\") " pod="openshift-infra/auto-csr-approver-29554294-j9grf" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.410754 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9k7l\" (UniqueName: \"kubernetes.io/projected/4e8ed95c-0459-40c7-a347-65e80a407f36-kube-api-access-b9k7l\") pod \"auto-csr-approver-29554294-j9grf\" (UID: \"4e8ed95c-0459-40c7-a347-65e80a407f36\") " 
pod="openshift-infra/auto-csr-approver-29554294-j9grf" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.487383 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554294-j9grf" Mar 11 19:34:00 crc kubenswrapper[4842]: I0311 19:34:00.991317 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29554294-j9grf"] Mar 11 19:34:01 crc kubenswrapper[4842]: I0311 19:34:01.472237 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 19:34:01 crc kubenswrapper[4842]: I0311 19:34:01.472321 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:34:01 crc kubenswrapper[4842]: I0311 19:34:01.858463 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554294-j9grf" event={"ID":"4e8ed95c-0459-40c7-a347-65e80a407f36","Type":"ContainerStarted","Data":"8aa69d0c65f7461ae3783f9a4d7ed8fbe5755f5dce6d69ac7f7e5fd1a55daf09"} Mar 11 19:34:02 crc kubenswrapper[4842]: I0311 19:34:02.867152 4842 generic.go:334] "Generic (PLEG): container finished" podID="4e8ed95c-0459-40c7-a347-65e80a407f36" containerID="bb7a6e703115036577e99bf26599f033bf3e1e2020b54ee89e73e5778fd662ed" exitCode=0 Mar 11 19:34:02 crc kubenswrapper[4842]: I0311 19:34:02.867253 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554294-j9grf" 
event={"ID":"4e8ed95c-0459-40c7-a347-65e80a407f36","Type":"ContainerDied","Data":"bb7a6e703115036577e99bf26599f033bf3e1e2020b54ee89e73e5778fd662ed"} Mar 11 19:34:04 crc kubenswrapper[4842]: I0311 19:34:04.200432 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554294-j9grf" Mar 11 19:34:04 crc kubenswrapper[4842]: I0311 19:34:04.252928 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9k7l\" (UniqueName: \"kubernetes.io/projected/4e8ed95c-0459-40c7-a347-65e80a407f36-kube-api-access-b9k7l\") pod \"4e8ed95c-0459-40c7-a347-65e80a407f36\" (UID: \"4e8ed95c-0459-40c7-a347-65e80a407f36\") " Mar 11 19:34:04 crc kubenswrapper[4842]: I0311 19:34:04.258595 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8ed95c-0459-40c7-a347-65e80a407f36-kube-api-access-b9k7l" (OuterVolumeSpecName: "kube-api-access-b9k7l") pod "4e8ed95c-0459-40c7-a347-65e80a407f36" (UID: "4e8ed95c-0459-40c7-a347-65e80a407f36"). InnerVolumeSpecName "kube-api-access-b9k7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 19:34:04 crc kubenswrapper[4842]: I0311 19:34:04.354495 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9k7l\" (UniqueName: \"kubernetes.io/projected/4e8ed95c-0459-40c7-a347-65e80a407f36-kube-api-access-b9k7l\") on node \"crc\" DevicePath \"\"" Mar 11 19:34:04 crc kubenswrapper[4842]: I0311 19:34:04.904952 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29554294-j9grf" event={"ID":"4e8ed95c-0459-40c7-a347-65e80a407f36","Type":"ContainerDied","Data":"8aa69d0c65f7461ae3783f9a4d7ed8fbe5755f5dce6d69ac7f7e5fd1a55daf09"} Mar 11 19:34:04 crc kubenswrapper[4842]: I0311 19:34:04.905006 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa69d0c65f7461ae3783f9a4d7ed8fbe5755f5dce6d69ac7f7e5fd1a55daf09" Mar 11 19:34:04 crc kubenswrapper[4842]: I0311 19:34:04.905040 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29554294-j9grf" Mar 11 19:34:05 crc kubenswrapper[4842]: I0311 19:34:05.271388 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29554288-qlws8"] Mar 11 19:34:05 crc kubenswrapper[4842]: I0311 19:34:05.278233 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29554288-qlws8"] Mar 11 19:34:06 crc kubenswrapper[4842]: I0311 19:34:06.970774 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a41747-fa41-48aa-b29a-715511817938" path="/var/lib/kubelet/pods/70a41747-fa41-48aa-b29a-715511817938/volumes" Mar 11 19:34:31 crc kubenswrapper[4842]: I0311 19:34:31.471331 4842 patch_prober.go:28] interesting pod/machine-config-daemon-csjgs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 11 19:34:31 crc kubenswrapper[4842]: I0311 19:34:31.471846 4842 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-csjgs" podUID="12f22b8b-b227-48b3-b1f1-322dfe40e383" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 19:34:33 crc kubenswrapper[4842]: I0311 19:34:33.989634 4842 scope.go:117] "RemoveContainer" containerID="f04295390938f31957090ffd7a011c0bd93dcb6109ac9c04dca44b3965d72694" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.773706 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s5ztx"] Mar 11 19:34:37 crc kubenswrapper[4842]: E0311 19:34:37.774855 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8ed95c-0459-40c7-a347-65e80a407f36" containerName="oc" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.774873 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8ed95c-0459-40c7-a347-65e80a407f36" containerName="oc" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.775106 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8ed95c-0459-40c7-a347-65e80a407f36" containerName="oc" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.777227 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.783910 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s5ztx"] Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.845602 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-utilities\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.845927 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dmq\" (UniqueName: \"kubernetes.io/projected/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-kube-api-access-t8dmq\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.846104 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-catalog-content\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.947757 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dmq\" (UniqueName: \"kubernetes.io/projected/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-kube-api-access-t8dmq\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.948070 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-catalog-content\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.948300 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-utilities\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.948693 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-catalog-content\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.948748 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-utilities\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:37 crc kubenswrapper[4842]: I0311 19:34:37.968989 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dmq\" (UniqueName: \"kubernetes.io/projected/f9958ea1-ab08-4749-a8d0-2371ac1e0c41-kube-api-access-t8dmq\") pod \"redhat-marketplace-s5ztx\" (UID: \"f9958ea1-ab08-4749-a8d0-2371ac1e0c41\") " pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:38 crc kubenswrapper[4842]: I0311 19:34:38.098560 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:38 crc kubenswrapper[4842]: I0311 19:34:38.624043 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s5ztx"] Mar 11 19:34:39 crc kubenswrapper[4842]: I0311 19:34:39.219173 4842 generic.go:334] "Generic (PLEG): container finished" podID="f9958ea1-ab08-4749-a8d0-2371ac1e0c41" containerID="4959685dfcfc19be6d4f210be453728bda25a184534a3abe22c4194c20c87b5e" exitCode=0 Mar 11 19:34:39 crc kubenswrapper[4842]: I0311 19:34:39.219389 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5ztx" event={"ID":"f9958ea1-ab08-4749-a8d0-2371ac1e0c41","Type":"ContainerDied","Data":"4959685dfcfc19be6d4f210be453728bda25a184534a3abe22c4194c20c87b5e"} Mar 11 19:34:39 crc kubenswrapper[4842]: I0311 19:34:39.219510 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5ztx" event={"ID":"f9958ea1-ab08-4749-a8d0-2371ac1e0c41","Type":"ContainerStarted","Data":"7c91c68377caa24bdb1033cb09d8d66d2742cb41131bd1e69e8315be7821e85c"} Mar 11 19:34:40 crc kubenswrapper[4842]: I0311 19:34:40.232088 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5ztx" event={"ID":"f9958ea1-ab08-4749-a8d0-2371ac1e0c41","Type":"ContainerStarted","Data":"29937c018f3e7262ef5c3914a3243c21521d2bf52d514caabbe61aeb88c2a870"} Mar 11 19:34:41 crc kubenswrapper[4842]: I0311 19:34:41.240769 4842 generic.go:334] "Generic (PLEG): container finished" podID="f9958ea1-ab08-4749-a8d0-2371ac1e0c41" containerID="29937c018f3e7262ef5c3914a3243c21521d2bf52d514caabbe61aeb88c2a870" exitCode=0 Mar 11 19:34:41 crc kubenswrapper[4842]: I0311 19:34:41.240831 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5ztx" 
event={"ID":"f9958ea1-ab08-4749-a8d0-2371ac1e0c41","Type":"ContainerDied","Data":"29937c018f3e7262ef5c3914a3243c21521d2bf52d514caabbe61aeb88c2a870"} Mar 11 19:34:42 crc kubenswrapper[4842]: I0311 19:34:42.252211 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5ztx" event={"ID":"f9958ea1-ab08-4749-a8d0-2371ac1e0c41","Type":"ContainerStarted","Data":"0ba003b0b5d5df33e91125ef8f0a6f83a2bfa973863b4afd3f7ce0dcec1db149"} Mar 11 19:34:42 crc kubenswrapper[4842]: I0311 19:34:42.282102 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s5ztx" podStartSLOduration=2.616570727 podStartE2EDuration="5.282083729s" podCreationTimestamp="2026-03-11 19:34:37 +0000 UTC" firstStartedPulling="2026-03-11 19:34:39.220736296 +0000 UTC m=+2724.868432576" lastFinishedPulling="2026-03-11 19:34:41.886249298 +0000 UTC m=+2727.533945578" observedRunningTime="2026-03-11 19:34:42.276988484 +0000 UTC m=+2727.924684764" watchObservedRunningTime="2026-03-11 19:34:42.282083729 +0000 UTC m=+2727.929780009" Mar 11 19:34:48 crc kubenswrapper[4842]: I0311 19:34:48.100669 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:48 crc kubenswrapper[4842]: I0311 19:34:48.102570 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:48 crc kubenswrapper[4842]: I0311 19:34:48.146143 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:48 crc kubenswrapper[4842]: I0311 19:34:48.354045 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s5ztx" Mar 11 19:34:48 crc kubenswrapper[4842]: I0311 19:34:48.400813 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-s5ztx"]